Accepted Papers
Optimization of Task Execution Period Using Evolutionary Algorithm
Tushar Kanti Dey1, Rahul Vishwakarma1 and Satyanand Vishwakarma2, 1I.I.T. Roorkee, India and 2N.I.T. Patna, India
Abstract: In a disk scheduling algorithm, the most important factor in achieving optimized performance is simultaneously optimizing two quantities: the execution period of a request and the number of missed tasks. Our work is motivated by this goal. This paper demonstrates that the simultaneous optimization of completion time (execution period) and missed tasks produces an observable improvement in real-time disk scheduling for applications that use large amounts of data. The protocol is implemented using an evolutionary algorithm, namely a multi-objective genetic algorithm, and our experimental results compare the performance of the proposed protocol with existing scheduling algorithms. The results demonstrate that the proposed algorithm offers the lowest execution period and the fewest missed tasks.
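The abstract does not spell out how the genetic algorithm trades off its two objectives, so the following is only a minimal Python sketch of Pareto dominance between candidate schedules scored on (execution period, missed tasks); the scores and the pareto_front helper are illustrative assumptions, not the authors' method:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b: no worse in every
    objective and strictly better in at least one (both minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    """Keep candidates whose (execution_period, missed_tasks) score is not
    dominated by any other candidate's score."""
    return [name for name, s in scores.items()
            if not any(dominates(t, s) for n, t in scores.items() if n != name)]

# Made-up scores: (execution_period, missed_tasks), both to be minimized.
scores = {"s1": (120, 3), "s2": (150, 1), "s3": (160, 4)}
print(pareto_front(scores))   # ['s1', 's2']: s3 is dominated by s1
```

A multi-objective GA keeps such a non-dominated front across generations instead of collapsing the two objectives into one number.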
Object Oriented Architectural and Design Pattern for High Performance Computing
Abbas Mohammed1 and Niraj Updhyaya2, 1VIFCET Hyderabad, India and 2JBIT Hyderabad, India
Abstract: The processing power of CPUs keeps growing. The complexity of an application depends not only on computational time (CPU consumption) but also on memory consumption (RAM) and communication bandwidth. Even if the CPU's processing capacity is very high, CPU time is wasted if data or instructions cannot be fed to it in time. To avoid processor waiting time, we need to develop patterns that require less data movement between processor and memory, including when loading classes and objects. When both processing power and memory are low (for example, on mobile devices), it is likewise necessary to load less data into memory. This paper discusses an architectural and design pattern that requires fewer classes/objects to be loaded for computation.
Storage Capacity Forecasting Modelling through Simple Exponential Smoothening Approach
Rahul Vishwakarma, Tata Consultancy Services, Bangalore, India
Abstract: Storage management becomes painful for system administrators when last-minute decisions must be made as system capacity approaches full. This can seriously impact business operations and budgets. There is therefore a need for a forecasting tool that can predict the future date on which the system will become full. Statistical predictive analysis has been applied to historical data, and the proposed forecasting model has been validated by applying it to two systems, helping system administrators make wise decisions after analysing the predictive strength of the model.
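Simple exponential smoothing itself is standard: the smoothed level follows s_t = α·x_t + (1 − α)·s_{t−1}. Below is a minimal Python sketch that smooths made-up daily used-capacity readings and naively extrapolates to a capacity-full date; the data, α, and the extrapolation rule are assumptions for illustration, not the paper's validated model:

```python
def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing: s_t = alpha*x_t + (1-alpha)*s_{t-1}."""
    s = [series[0]]
    for x in series[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

used_gb = [410, 415, 423, 431, 436, 444, 452]  # made-up daily readings
capacity_gb = 500
level = exp_smooth(used_gb)
daily_growth = level[-1] - level[-2]           # smoothed daily increment
days_left = (capacity_gb - used_gb[-1]) / max(daily_growth, 1e-9)
print(f"volume full in roughly {days_left:.0f} days")
```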
Sociointercultural Evaluation for Investment Projects in Indigenous Communities Wixarikas
Jose G. Vargas-Hernandez1, Ernesto Guerra-Garcia2 and Maria Eugenia Meza-Hernandez2, 1University of Guadalajara, Mexico and 2Universidad Autonoma Indigena de Mexico, Mexico
Abstract: This paper analyzes aspects of the problems that arise in the social evaluation of investment projects for the indigenous Wixarika (Huichol) communities. A project in this context makes the evaluation particularly complex: in the socio-economic perspective from which it is evaluated, the incommensurability of social and intercultural issues that cannot be ignored comes into play. The paper addresses the questions that have arisen in the development of this type of project and presents a theoretical framework for the methodological proposal of socio-cultural evaluation.
Urban Traffic Management System by Videomonitoring
Jose Raniery Ferreira Junior, Federal University of Alagoas, Brazil
Abstract: As big cities grow, the use of cameras for urban traffic monitoring becomes more necessary. The increase in the number of vehicles on the streets makes traffic, one of the largest metropolitan problems, occur even more often. To help avoid this kind of issue, this paper proposes a videomonitoring management system for urban traffic. The goal is to identify vehicles and count them over a period of time using Computer Vision and Image Processing techniques.
Improving Face Recognition using Two-Dimensional Subspace Analysis
Benouis Mohamed1, Senouci Mohamed1, Tlemsani Redouane2 and Benhmza Younes Houari2, 1Universite d'Oran, Algeria and 2INTTIC, Algeria
Abstract: In this article we present an approach to face recognition based on the combination of feature extraction methods, namely two-dimensional DWT-2DPCA and DWT-2DLDA, with a neural network, to improve the accuracy and computation time of a face recognition system.
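As background, the 2DPCA step in such pipelines projects each image matrix onto the leading eigenvectors of an image covariance matrix. A minimal numpy sketch of plain 2DPCA follows; the authors additionally apply a 2D DWT first and feed a neural network afterwards, none of which is shown here, and the variable names are hypothetical:

```python
import numpy as np

def twod_pca(images, k):
    """Plain 2DPCA: project h-by-w images onto the top-k eigenvectors
    of the w-by-w image covariance matrix."""
    mean = np.mean(images, axis=0)
    G = sum((A - mean).T @ (A - mean) for A in images) / len(images)
    vals, vecs = np.linalg.eigh(G)           # eigenvalues in ascending order
    W = vecs[:, np.argsort(vals)[::-1][:k]]  # keep the top-k columns
    return [A @ W for A in images], W

faces = [np.random.default_rng(i).random((32, 32)) for i in range(10)]
feats, W = twod_pca(faces, k=4)
print(feats[0].shape)   # (32, 4): a much smaller feature matrix per face
```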
Symbolic Classification of Traffic Video Shots
Elham Dallalzadeh1, D.S. Guru2 and B.S. Harish3, 1Islamic Azad University, Iran, 2University of Mysore, India and 3S J College of Engineering, Mysore, India
Abstract: In this paper, we propose a symbolic approach for classifying traffic video shots based on their content. We propose to represent a traffic video shot by an interval-valued feature vector. Unlike conventional methods, the interval-valued feature vector representation is able to preserve the variations existing among the extracted features of a traffic video shot. Based on the proposed symbolic representation, we present a method of classifying traffic video shots. The proposed classification model makes use of a symbolic similarity measure to classify the traffic video shots. Experimentation is carried out on a benchmark traffic video database. Experimental results reveal the efficacy of the proposed symbolic classification model. Moreover, it achieves classification within negligible time, as it is based on a simple matching scheme.
Color Image Compression Scheme with Reduced Computational Complexity
Fituri Belgassem and Sabriya Elghanai, Higher Institute of Electronics, Libya
Abstract: Block Truncation Coding (BTC) is a simple and fast algorithm for coding digital images that achieves a constant bit rate of 2 bits per pixel. The compression may be improved by coding only half of the bits in the BTC bit plane of each block and interpolating the other half, giving a bit rate of 1.5 bits per pixel. A low computational complexity compression scheme for coding color images based on AMBTC is presented in this paper. Four techniques are employed in this compression scheme: quad-tree segmentation, AMBTC bit plane omission, bit plane coding using 32 predefined visual patterns, and one of the interpolative bit plane coding techniques. The algorithm has been investigated and applied to different still color images. The simulation results show that the scheme achieves an average bit rate of 0.385 bits per pixel for color images, with an average PSNR of 30.71 dB.
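For context, the 2 bits/pixel figure for baseline BTC/AMBTC on a 4×4 block comes from storing a 16-bit bitmap plus two 8-bit reconstruction levels: (16 + 8 + 8) / 16 = 2 bpp. A minimal Python sketch of the basic AMBTC block step — only the baseline the paper builds on, not its quad-tree or visual-pattern stages; the sample block is made up:

```python
import numpy as np

def ambtc_block(block):
    """AMBTC on one block: bitmap + (low, high) group means.
    For a 4x4 block: 16 bitmap bits + two 8-bit levels = 2 bits/pixel."""
    mean = block.mean()
    bitmap = block >= mean
    high = block[bitmap].mean() if bitmap.any() else mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, low, high

def ambtc_decode(bitmap, low, high):
    """Reconstruct: high where the bitmap is set, low elsewhere."""
    return np.where(bitmap, high, low)

blk = np.array([[12, 200, 34, 180], [90, 95, 100, 105],
                [10, 15, 240, 230], [60, 70, 80, 220]], dtype=float)
bitmap, low, high = ambtc_block(blk)
print(ambtc_decode(bitmap, low, high))
```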
Implementing IT Security for Small Businesses within Limited Resources
Margie S. Todd, Syed (Shawon) M. Rahman and Sevki Erdogan, Capella University, USA
Abstract: The purpose of this article is to present a comprehensive, budget-conscious security plan for smaller enterprises that lack security guidelines. We believe this paper will assist users in writing an individualized security plan. We have also provided the top ten tools that are either free or affordable, to get some semblance of security implemented.
Keywords Extraction via Multi-relational Network Construction
Kai Lei, Hanhong Tang and Yifan Zeng, Peking University, China
Abstract: Keyword extraction can be regarded as a process of ranking the words in a given document (set) according to their importance to this document (set). Previous graph-based methods usually consider only one kind of relation between words, such as co-occurrence, ignoring the fact that words in a text interact with each other via multiple relations, which collaborate to decide the importance of words. Although some recently published methods use more than one relation type, they fail to consider the interactions between relations. Therefore, we propose a new approach for keyword extraction that constructs a multi-relational network from texts and evaluates the various relations at the same time. Experiments show that our approach is competitive compared with some typical methods.
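To make the single-relation baseline the authors improve on concrete, here is a hedged Python sketch of TextRank-style ranking over one co-occurrence graph via power iteration; the paper's contribution is combining several relation networks, which this does not show, and the window size and damping factor are conventional defaults, not the paper's settings:

```python
import numpy as np
from itertools import combinations

def cooccurrence_rank(tokens, window=2, d=0.85, iters=50):
    """PageRank over a word co-occurrence graph (single-relation baseline)."""
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    W = np.zeros((len(vocab), len(vocab)))
    for i in range(len(tokens) - window + 1):
        for a, b in combinations(set(tokens[i:i + window]), 2):
            W[idx[a], idx[b]] += 1
            W[idx[b], idx[a]] += 1
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)  # row-stochastic
    r = np.full(len(vocab), 1.0 / len(vocab))
    for _ in range(iters):
        r = (1 - d) / len(vocab) + d * (P.T @ r)              # power iteration
    return sorted(zip(vocab, r), key=lambda t: -t[1])

text = "graph based keyword extraction ranks words in a word graph".split()
print(cooccurrence_rank(text)[:3])   # highest-ranked words first
```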
Accelerating Super-Resolution Reconstruction Using GPU by CUDA
Toygar Akgun and Murat Gevrekci, ASELSAN, Turkey
Abstract: This paper demonstrates a massively multi-threaded implementation of super-resolution image formation on the NVIDIA CUDA architecture. On the algorithm side, maximum a-posteriori (MAP) reconstruction is adopted with a sub-pixel translational motion estimation algorithm for spatial resolution enhancement. The resulting algorithm is implemented in CUDA on a low-end GT640 GPU, and an overall speedup of 5-6 times is achieved compared to an ANSI C implementation running on a Core i5 CPU.
Boundary Extraction in Texture Mosaics Using Discontinuity Information in Feature Space
Ali Ozturk1 and Ahmet Arslan2, 1KTO Karatay University, Turkey and 2Selcuk University, Turkey
Abstract: In this study, boundary extraction between textures is examined. The overall system consists of three stages. In the first stage, the gradients in feature space are estimated using a modified version of gray-level edge detection operators; for comparison purposes, both the Prewitt and Sobel operators are used. The second stage involves applying a threshold value to obtain a binary image displaying the edges found in the first stage. In the last and crucial stage, morphological post-processing operations are applied to the binary edge image to remove spurious pixels inside regions and to thin the thick edges that occur due to both rough thresholding and the use of a large displacement value in edge detection. To discriminate between textures, four different features are used. The first three features are the fractal dimension (FD) of the original image, the contrast-stretched image and the top-hat transformed image, respectively. The fourth feature is the entropy, a parameter obtained from the spatial gray-level co-occurrence matrix of the image. Experimental results are presented for mosaics with different numbers of textures from the Brodatz album.
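The fourth feature is a standard texture measure: with p(i, j) the normalized gray-level co-occurrence matrix, entropy is H = −Σ p(i, j) log₂ p(i, j). A small Python sketch for one displacement vector; the quantization level and displacement are illustrative choices, not taken from the paper:

```python
import numpy as np

def glcm_entropy(img, dx=1, dy=0, levels=8):
    """Entropy of the gray-level co-occurrence matrix for one displacement."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1   # count level pairs
    p = glcm / glcm.sum()
    nz = p[p > 0]                                    # avoid log(0)
    return -np.sum(nz * np.log2(nz))

img = np.random.default_rng(1).integers(0, 256, (32, 32))
print(glcm_entropy(img))
```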
A Tool for Analyzing Outbreak Detection Algorithms
Yasin Şahin, Hacettepe Universitesi, Turkey
Abstract: Although the main focus of recent biosurveillance research is the threat of bioterrorist attacks, governments all over the world are also trying to detect natural outbreaks. Indeed, international organizations such as the WHO, OECD and EU have published public declarations about the necessity of an international central surveillance system. Each data source and contagious disease carries its own patterns; therefore, the outbreak detection process cannot be standardized. Various methods have been analyzed and test results published in biosurveillance research. In general, these methods are algorithms from the SPC and Machine Learning literature, although specific algorithms have also been proposed, such as the Early Aberration Reporting System (EARS) methods. Differences between published results show that both the characteristics of the time series an algorithm is tested on and the chosen parameters of that algorithm determine the results. Our tool provides preprocessing of data, as well as testing, analyzing and reporting on anomaly detection algorithms specialized for biosurveillance. These functionalities make it possible to use the outputs for comparing algorithms and for decision making.
Flowchart based Programming Environments Aimed at Novices
Danial Hooshyar and Maen Alrashdan, APU, Malaysia
Abstract: A key weakness of many novice programmers lies in their problem-solving and analysis skills. This shortcoming is intensified by the complexities associated with the development environment and the language syntax that novices employ. One strategy that deals with the difficulties experienced by novice programmers in introductory programming courses is to reform the teaching model, specifically within the context of technological support. One way of applying this strategy is to employ visual programming languages, of which the flowchart approach is the iconic one. Additionally, iconic programming environments traditionally endeavor to simplify the programming task by reducing the level of precision and manual typing usually required in conventional textual programming languages; these environments try to enhance the speed with which problem-solving and implementation take place. The focus of this essay is on the exploitation of flowcharts as a visual aid in programming, so all the systems covered are flowchart-oriented. Since computer science instructors do not have access to a comprehensive survey of research in this area, this paper collects and classifies the literature, identifies important work, and mediates it to computing educators and professional bodies. A search of the relevant literature and the internet has revealed fifteen related systems developed from 1992 onwards, and the systems reviewed demonstrate that the use of dynamic executable flowcharts, adopted by several authors, is an effective approach for teaching novice programmers.
A Novel Combination of Features for Assured Practical Application of HMM in Handwritten Character Recognition
Binod Prasad, Bengal College of Engineering and Technology, India
Abstract: The recognition rate for handwritten characters is still limited to around 90 percent due to the large variation in shape, scale and format of handwritten characters. A sophisticated handwritten character recognition system demands a better feature extraction technique that can take care of such variation in handwriting. Moreover, given the limited recognition rate of 1D HMM character recognition systems and the relatively large number of samples required to construct a 2D HMM system, HMM-based character recognition lags behind other systems as far as assured practical application is concerned. The purpose of this research work is to enhance the recognition rate of an off-line handwritten character recognition system using a 1D HMM with an ideal combination of features and decreased algorithm complexity, so that it can be implemented practically with full confidence. The overall efficiency of the system has been found to be 94.63%.
Keywords: Character recognition, Hidden Markov Model, Zone theory, Projection features, Curvature features, Baum-Welch algorithm.
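In HMM-based recognition, each character class typically gets its own model trained with Baum-Welch, and a test sequence of feature symbols is assigned to the class whose model scores it highest under the forward algorithm. A minimal scaled-forward sketch in Python; the toy parameters are invented, and the paper's feature vocabulary and model sizes are not shown:

```python
import numpy as np

def forward_loglik(pi, A, B, obs):
    """Forward algorithm with scaling: log P(obs | HMM(pi, A, B))."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate then weight by emission
        logp += np.log(alpha.sum())
        alpha /= alpha.sum()            # rescale to avoid underflow
    return logp

# Tiny made-up 2-state model over 3 discrete feature symbols:
pi = np.array([0.6, 0.4])                       # initial state probabilities
A = np.array([[0.7, 0.3], [0.4, 0.6]])          # state transitions
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])  # symbol emissions
print(forward_loglik(pi, A, B, [0, 1, 2, 1]))
```

Classification then compares this log-likelihood across the per-character models.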
Block Updates on Truncated ULV Decomposition
Jesse L. Barlow1, Ebru Aydogan2 and Hasan Erbay2, 1Penn State University, United States and 2Kirikkale University, Turkey
Abstract: A truncated ULV decomposition (TULV) of an $m \times n$ matrix $X$ of rank $k$ is a decomposition of the form $X = U_1 L V_1^T + E$, where $U_1$ and $V_1$ are left orthogonal matrices, $L$ is a $k \times k$ non-singular lower triangular matrix, and $E$ is an error matrix. Only $U_1$, $V_1$, $L$, and $\|E\|_F$ are stored. We propose algorithms for block updating the TULV based upon the Block Classical Gram-Schmidt algorithm of [4]. We also use a refinement algorithm that reduces $\|E\|_F$, detects rank degeneracy, corrects it, and sharpens the approximation.
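The Block Classical Gram-Schmidt step such updates build on orthogonalizes a new block of columns against an existing left-orthogonal factor. A hedged numpy sketch of just that step; the paper's update also maintains L, makes rank decisions and reorthogonalizes, none of which appears here:

```python
import numpy as np

def block_cgs_step(U, A):
    """One Block Classical Gram-Schmidt step: split the new block A into
    its component in range(U) and an orthonormal remainder Q."""
    R = U.T @ A                      # coefficients of A along columns of U
    Q, S = np.linalg.qr(A - U @ R)   # orthonormalize the residual block
    return Q, R, S                   # [U Q] is (numerically) left orthogonal

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((8, 3)))   # existing orthonormal basis
A = rng.standard_normal((8, 2))                    # incoming block of columns
Q, R, S = block_cgs_step(U, A)
print(np.allclose(U.T @ Q, 0, atol=1e-10))         # True: Q orthogonal to U
```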
Semantic Integration for Automatic Ontology Mapping
Siham Amrouch1 and Sihem Mostefai2, 1Med Cherif Messadia University, Algeria and 2Mentouri University, Algeria
Abstract: In the last decade, ontologies have played a key role as a technology for information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, ontology mapping is a solution that cannot be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, of course after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect, based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion, Glue, etc. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyzes the concepts' names and the latter analyzes their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
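Combining a lexical (string-level) and a semantic (WordNet-based) similarity is a common pattern for such matchers. A small Python sketch of one plausible combination using NLTK's WordNet interface; the 0.5 weighting and first-synset choice are assumptions for illustration, not the paper's measure:

```python
from difflib import SequenceMatcher
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

def lexical_sim(a, b):
    """String-level similarity between two concept names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a, b):
    """WordNet Wu-Palmer similarity between the first synsets, if any."""
    sa, sb = wn.synsets(a), wn.synsets(b)
    return (sa[0].wup_similarity(sb[0]) or 0.0) if sa and sb else 0.0

def combined_sim(a, b, w=0.5):
    """Weighted mix of lexical and semantic scores (weight is an assumption)."""
    return w * lexical_sim(a, b) + (1 - w) * semantic_sim(a, b)

print(combined_sim("car", "automobile"))   # high despite no string overlap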
Complementary Problems for Subset-Sum and Change Making Problems
Asli Guler1 and Urfat Nuriyev2, 1Yasar University, Turkey and 2Ege University, Turkey
Abstract: In this study, the Change Making Problem (CMP) and the Subset-Sum Problem (SSP), which can arise in practice in some classes of one-dimensional cargo loading and cutting stock problems, are studied. These problems are often used in computer science as well. CMP and SSP are NP-hard, and both can be seen as variants of the knapsack problem in some ways. The complementary problems for the change making problem and the subset-sum problem are defined in this study, with the aim of examining CMP and SSP by means of their complementary problems.
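For reference, SSP asks whether some subset of given weights hits a target sum exactly; the classic pseudo-polynomial dynamic program below illustrates the base problem (the paper's complementary-problem construction itself is not shown):

```python
def subset_sum(weights, target):
    """Classic pseudo-polynomial DP: is there a subset summing to target?"""
    reachable = {0}                     # sums reachable with a prefix of items
    for w in weights:
        reachable |= {s + w for s in reachable if s + w <= target}
    return target in reachable

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5
```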
A New Method for Estimation of Missing Data Based on Sampling Methods for Data Mining
Rima Houari1, Ahcene Bounceur2, Tahar Kechadi3 and Reinhardt Euler2, 1University of Abderrahmane Mira Bejaia, Algeria, 2Lab-STICC, France and 3University College Dublin, Ireland
Abstract: Today we collect large amounts of data and receive more than we can handle; the accumulated data are often raw and far from being of good quality, containing missing values and noise. The presence of missing values in data is a major disadvantage for most data mining algorithms. Intuitively, the pertinent information is embedded in many attributes, and its extraction is only possible if the original data are cleaned and pre-treated. In this paper we propose a new technique for preprocessing data that aims to estimate the missing values, in order to obtain representative samples of good quality and to ensure that the extracted information is safer and more reliable.
Application of Distributed Data Mining Techniques for Email Forensics
Salhi Dhai Eddine1, Tari Abdel Kamel1 and Kechadi Mohand Tahar2, 1Universite de Bejaia, Algeria and 2University College Dublin, Ireland
Abstract: Nowadays, email has become one of the most popular daily means of communication accessible via the Internet. In our inboxes we receive malicious (forensically relevant) emails whose senders we do not know, so building an automatic checking system has become a necessity. To this end, in this paper we present a new method of processing emails to extract the bad emails from a mail server or a user's inbox, using distributed data mining techniques. This study will reduce the risk of email users being hacked, and it also helps mail server administrators detect bad emails and make their servers more secure.
An Intelligent System for Image Fusion
Ashok Kumar, Amruta Shelar and Varala Naidu, University of Louisiana at Lafayette, USA
Abstract: Recent years have seen an increasing interest in developing algorithms for image fusion, and several algorithms have been proposed in the literature. However, a process for assessing fusion algorithms and arriving at the best solution for a given set of images has not been sufficiently explored so far. In this paper, a system is proposed that performs intelligent decision making in image fusion. The system uses the concepts of adaptive learning and inherent knowledge to present the best fusion solution for a given set of images. By automating the selection process, the system can analyze and exhibit intrinsic details of the images and adapt this knowledge to provide better solutions for varying types of images.
On the Nearest Neighbor Algorithms for the Traveling Salesman Problem
Gozde KIZILATEŞ and Fidan NURİYEVA, Ege University, Turkey
Abstract: In this study, a modification of the nearest neighbor algorithm (NND) for the traveling salesman problem (TSP) is studied. The NN and NND algorithms are applied to different instances, starting from each of the vertices, and the performance of the algorithm for each starting vertex is examined. Based on the experimental results, the NNDG algorithm, a hybrid of the NND algorithm and the Greedy algorithm, is proposed and tested on different library instances.
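The baseline these variants start from is the plain nearest neighbor construction: from the current city, always move to the closest unvisited one. A minimal Python sketch on made-up coordinates; the NND/NNDG modifications studied in the paper are not reproduced here:

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Greedy NN heuristic: repeatedly visit the closest unvisited city."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (2, 1), (1, 4), (5, 2), (4, 4)]   # made-up coordinates
print(nearest_neighbor_tour(cities))
```

Trying every vertex as `start` and keeping the best tour is exactly the per-vertex comparison the abstract describes.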
A Design of Low Sampling-Rate Wireless Algorithm for Intelligent Robots Based on Compressed Sensing Theory
Narges Balouchestani Asli, University of Toronto, Canada
Abstract: Artificial Intelligence (AI) is the area of Information and Communication Technology (ICT) science focused on creating Intelligent Robots (IRs) that engage in behaviors humans consider intelligent. Today, by employing Compressed Sensing (CS) theory as a new, low sampling-rate approach, the dream of IRs is becoming a reality. IRs suffer from some important problems, such as limited processing capability, low storage capacity, limited energy, and limited transmission capacity. CS theory promises to be a key element in wireless communication systems, including wireless robots for the next generation. The aforementioned highlights the need for, and the advantage of, wireless communication based on CS theory with a low sampling rate and low power consumption. With this in mind, the Compressed Sensing procedure as a new sampling approach, in collaboration with an AI framework, is used to provide a robust low-sampling algorithm for IRs with high probability and sufficient accuracy. Advanced IR systems based on our approach will be able to deliver web control, intelligent automation, and intelligent healthcare systems in many industrial, civilian, and medical applications. Our simulation results on a unicycle robot, in collaboration with a True Time Simulation (TTS) toolbox, show that the sampling rate can be reduced to 25% of the Nyquist rate without sacrificing performance; with further decreases in the sampling rate, performance gradually degrades. Our simulation results also show a good reduction in power consumption, which in turn increases the lifetime of IR systems.
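The core CS idea behind the 25%-of-Nyquist figure is to replace n Nyquist-rate samples of a sparse signal with m ≪ n random linear measurements y = Φx. A tiny numpy sketch of the measurement side; the dimensions and sparsity are invented, and recovery, which needs an l1-type solver, is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400                          # Nyquist-rate sample count
m = n // 4                       # 25% of Nyquist, as in the reported results
x = np.zeros(n)                  # signal that is sparse (8 nonzeros here)
x[rng.choice(n, size=8, replace=False)] = rng.standard_normal(8)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = Phi @ x                      # m linear measurements instead of n samples
print(y.shape)                   # (100,); recovery would use an l1/OMP solver
```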
A Solution to the Problem of Congestion in 2-Dimensional Broadcast-Based Multiprocessor Architectures
Cigdem Inan ACI and Mehmet Fatih AKAY, Cukurova University, Turkey
Abstract: Network dimension is one of the most important design issues of interconnection networks. Despite the advantages and common usage of 2-D interconnection networks, congestion is still a problem in them. In this paper, we propose a congestion control algorithm to improve the performance of a 64-node, 2-D Simultaneous Optical Multiprocessor Exchange Bus (SOME-Bus) network. We utilized only-input and only-output algorithm parameters. The congestion control algorithm is tested via simulation under uniform (UN) and hot region (HR) synthetic traffic patterns with a non-independent traffic source. We concentrate on two performance metrics: average network response time and average processor utilization. Compared with the case where the algorithm is not applied, the proposed algorithm is able to decrease the average network response time by 16.20% to 26.51% and increase average processor utilization by 5.19% to 12.50%.
Full-Wave Analysis of High-Tc Superconductor Triangular Patch Antenna
Ouarda Barkat, University of Constantine 1, Algeria
Abstract: The design of superconducting triangular microstrip patches implanted on an anisotropic substrate is presented in this research. Previous authors have proposed several analytical expressions for simple antennas to obtain the quality factor Q, which is of practical importance because of its relationship to the antenna bandwidth. This paper presents a simple method to calculate Q using a full-wave method. The moment method in the spectral domain is used to study the scattering properties of superconducting equilateral triangular antennas. In this method, the electric field integral equation for a current element on a grounded dielectric slab of infinite extent is developed using basis functions. An improved analytical model is presented, taking into account the anisotropic substrate and the superconducting material of the triangular patch. The obtained results are compared with previously published work and are in good agreement.
Integration of Islamic Banking System Based on Service Oriented Architecture and Enterprise Service Bus
Ako A. Jaafar and Dayang N.A. Jawawi, Universiti Teknologi Malaysia, Malaysia
Abstract: Integration is the most important part of a complex system like an Islamic Banking System (IBS). Most Islamic Banking Systems come with two different parts, a financial part and a deposit part, which makes an IBS more complex to integrate than other types of banking systems. Despite the ability of current technologies to integrate different applications, integrating an IBS remains complicated due to poor reusability and a lack of loose coupling in present technologies and approaches, such as traditional Enterprise Application Integration (EAI) or Point-to-Point Web Services (P2PWS). This paper presents the concept of Service Oriented Architecture (SOA) based application integration, proposing an application integration framework for IBS using an Enterprise Service Bus (ESB) and the Business Process Execution Language (BPEL). The output of this paper shows that applying ESB/BPEL to IBS increases the reusability and loose coupling of IBS services.
Speeding up the Web Crawling Process on a Multi-Core Processor System
Hussein Al-Bahadili1 and Hamzah Qtishat2, 1Petra University, Jordan and 2Middle-East University, Jordan
Abstract: Web crawlers are important components of Web search engines. They demand a large amount of hardware resources (CPU and memory) to crawl data from the rapidly growing and changing Web. The crawling process should be performed continuously to keep the crawled data up to date. This paper develops a new approach to speed up the crawling process on a multi-core processor through virtualization. In this approach, the multi-core processor is divided into a number of virtual machines (VMs) that can run in parallel (concurrently), performing different crawling tasks on different initial data. In particular, the paper presents a description, implementation, and evaluation of a VM-based distributed Web crawler. The speedup factors achieved by the VM-based crawler over a crawler without virtualization are estimated for crawling various numbers of documents, and the effect of the number of VMs on the speedup factor is investigated. For example, on an Intel® Core™ i5-2300 CPU @ 2.80 GHz with 8 GB memory, a speedup factor of ~1.48 is achieved when crawling 70000 documents on 3 and 4 VMs.
Linear Time Complexity Sort Algorithm
Mohammad Reza Ghaeini and Mohammad Reza Mirzababaei, Amirkabir University of Technology, Tehran, Iran
Abstract: In the field of computer science and mathematics, sorting algorithms put the elements of a list in a certain order, ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. In this paper we present an improved stable sort algorithm, based on the bucket sort algorithm, that statistically, on average, performs the sorting operation with linear time complexity (O(n)). Our algorithm is 50 percent faster than comparison sorting algorithms, e.g. Quicksort and Merge sort.
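A generic stable bucket sort, the family this paper's algorithm improves upon, runs in O(n) expected time when inputs are roughly uniformly distributed: n elements scattered into about n buckets leave O(1) items per bucket on average. A hedged Python sketch of that baseline, not the authors' improved variant:

```python
def bucket_sort(values, n_buckets=None):
    """Stable bucket sort: O(n) expected time for uniform inputs."""
    if not values:
        return []
    n_buckets = n_buckets or len(values)
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_buckets or 1          # guard the all-equal case
    buckets = [[] for _ in range(n_buckets)]
    for v in values:                             # scan preserves input order
        idx = min(int((v - lo) / width), n_buckets - 1)
        buckets[idx].append(v)
    out = []
    for b in buckets:
        out.extend(sorted(b))                    # sorted() is stable; buckets
    return out                                   # hold O(1) items on average

print(bucket_sort([0.42, 0.17, 0.93, 0.17, 0.58]))
```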
Rel-RASCONet: Co-Training for Networked Data Using Relevant and Random Subspaces
Zehra Cataltepe and Kadriye Baglioglu, Istanbul Technical University, Turkey
Abstract: In this paper, we introduce Rel-RASCONet (Relevant Random Subspace Co-training for Networked Data), a Co-training algorithm for transductive classification of networked data. In the algorithm, first of all we construct enriched features for networked data, which consist of the node features, aggregated neighbor features, and aggregations of different element-wise combinations (such as logical COUNT or OR) of node and neighbor features. Then, we select relevant and random subsets of features and train classifiers on them using the training data. In order to label the test data, as in the traditional Co-training algorithm, we let each classifier label the test instances for which it is most confident and add those instances with the predicted labels to the training data. We retrain the classifiers and repeat until all the test data is labeled. On two networked datasets, we show that Rel-RASCONet, using Co-training on test data with relevant and random subspaces of features, results in better performance than using just the node features (i.e. Content Only classification). We also show that instead of Co-training classifiers until all test data are labeled, early stopping of Co-training can produce better results for Rel-RASCONet.
Path Guided Abstraction Refinement for Safety Program Verification
Aleb Nassima and Kechid Samir, University of Sciences and Technology, Algeria
Abstract: This paper presents a new compositional approach for safety verification of C programs. A program is represented by a sequence of assignments and a sequence of guarded blocks. Abstraction consists of abstracting the program into a set of blocks relevant to the erroneous location (EL). As in the CEGAR paradigm, the abstracted model is used to prove or disprove the property. This check is performed for each block backwardly, using weakest preconditions to generate a formula whose satisfiability is checked. If the abstraction is too coarse to allow deciding on the satisfiability of the formula, a path-guided refinement is performed. Our technique can handle programs containing function calls and pointers. All aspects described in this paper are illustrated by clarifying examples.
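For readers new to weakest preconditions: for an assignment, wp(x := e, Q) = Q[e/x], and sequences are processed backwards, wp(S1; S2, Q) = wp(S1, wp(S2, Q)). A small worked instance, not taken from the paper:

```latex
% Backward WP over a two-assignment block with postcondition x > y:
wp(\; x := x + 1;\ y := 2y,\;\; x > y \;)
  = wp(\; x := x + 1,\;\; x > 2y \;)   % substitute 2y for y
  = x + 1 > 2y                         % substitute x + 1 for x
```

The resulting formula is what gets handed to a satisfiability check in such backward analyses.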
Automatic Language Pattern Elicitation from Biomedical Literature
Seyed Ziaeddin Alborzi, Nanyang Technological University, Singapore
Abstract: The number of research articles being published over the years has been overwhelming, and it continues to rise each day. This rapid growth, combined with the unstructured nature of text written in natural languages, has created the need to develop tools and methods that aid the process of information extraction, making it more accessible and utilizable. In this work, we present an approach for language pattern acquisition from the biomedical literature. In our method, all possible patterns are generated (candidate enumeration), and those patterns which have a match in the training corpus are selected. Equipped with gene and protein name glossaries plus a keyword database, we achieved a recall of 52.2% with a precision of 40.9%, identifying 321 gene ontology terms.
Important Dates
Paper Submission Deadline:
11 February, 2013
Paper Status Notification:
15 March, 2013
Camera-ready Due:
30 March, 2013
Conference:
June 7~9, 2013
Past Events
CCSEIT 2011
CCSEIT 2012