Security of Virtual Machines:
Virtual machines (VMs) are rapidly replacing physical machine infrastructures because of their ability to emulate hardware environments, share hardware resources, and run a variety of operating systems (OSs). VMs offer a better security model than traditional machines by providing an additional layer of hardware abstraction and isolation, effective external monitoring and recording, and on-demand access. However, this new model requires adapting existing security methods, which currently cannot keep pace with the ease of creating new VMs in a variety of configurations and lifecycles. Attackers have successfully compromised VM infrastructures, gaining access to other VMs on the same system and even to the host. Fortunately, these security concerns are being addressed, and users can prevent most intrusions by applying traditional security measures to each VM.
Secure Mobile Ad Hoc Routing:
Mobile Ad Hoc Networks (MANETs) are a promising area of application for emergent computing techniques. The set of applications for MANETs is diverse, ranging from small, static networks that are constrained by power sources, to large-scale, mobile, highly dynamic networks. The design of network protocols for these networks is a complex issue. Regardless of the application, MANETs need efficient distributed algorithms to determine network organization, link scheduling, and routing. Mobile devices in these environments exhibit various levels of processing power, mobility and connectivity, but existing approaches do not consider these characteristics.
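The routing problem mentioned above can be illustrated with a minimal sketch: computing hop-count routing tables over a snapshot of a MANET topology. This is a simplification, not any particular MANET protocol; the topology and node names are invented, and a real network would recompute routes as the neighbour sets change with mobility.

```python
from collections import deque

def build_routing_table(adjacency, source):
    """Compute next-hop routing entries from `source` using BFS hop counts.

    `adjacency` maps each node to the set of nodes currently in radio
    range; in a real MANET this neighbour set changes as devices move.
    """
    table = {}                      # destination -> (next_hop, hop_count)
    queue = deque([(source, None, 0)])
    visited = {source}
    while queue:
        node, first_hop, hops = queue.popleft()
        if node != source:
            table[node] = (first_hop, hops)
        for neighbour in adjacency.get(node, ()):
            if neighbour not in visited:
                visited.add(neighbour)
                # the first hop is fixed once a route leaves the source
                next_first = neighbour if node == source else first_hop
                queue.append((neighbour, next_first, hops + 1))
    return table

# Toy five-node topology: a chain A-B-C-D, with A also hearing E
topology = {
    "A": {"B", "E"}, "B": {"A", "C"}, "C": {"B", "D"},
    "D": {"C"}, "E": {"A"},
}
table = build_routing_table(topology, "A")
print(table)   # e.g. packets for D leave A via next hop B, 3 hops away
```

A distributed protocol would build the same information by exchanging messages between neighbours rather than by central computation, which is exactly where the efficiency and dynamism challenges described above arise.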
Formal Models and Verification of Security Protocols:
Security protocols can be vulnerable to attacks even under perfect cryptographic assumptions, and the protocol verification problem is in general undecidable. Manual verification of protocols is difficult and may not bring all flaws to light. Formal modeling and verification help overcome this difficulty and provide a platform for automatic verification. Many formal models have been proposed in the literature.
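The core idea behind explicit-state verification can be sketched in a few lines: exhaustively enumerate the reachable states of a finite protocol model and check each one against a safety property. The toy handshake below is invented for illustration; real model checkers (SPIN, for example) work on far richer models, but the reachability loop is the same in spirit.

```python
def reachable_states(initial, transitions):
    """Exhaustively explore a finite protocol state space, the core
    operation of an explicit-state model checker."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Toy two-step handshake; a state is (phase, key_was_sent_in_clear).
def transitions(state):
    phase, leaked = state
    if phase == "init":
        # honest branch: send an encrypted nonce;
        # flawed branch: an implementation sends the key in the clear
        return [("challenge", leaked), ("challenge", True)]
    if phase == "challenge":
        return [("done", leaked)]
    return []

states = reachable_states(("init", False), transitions)
violations = [s for s in states if s[0] == "done" and s[1]]
print(violations)   # non-empty: the secrecy property can be violated
```

The automation promised by formal methods lies precisely here: the tool, not the analyst, enumerates every interleaving and reports any reachable bad state.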
Program Analysis and Code Optimization:
Compilers remain an area of active research, and code optimization is an important phase of compilation. The overall aim of this phase is to reduce the time taken, the space used, or both, subject to various constraints. At the compiler level, only relatively low-level semantic transformations are possible, but they are very useful for eliminating many of the inefficiencies introduced when a program is translated from its high-level form to low-level machine code.
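One of the simplest such low-level semantic transformations is constant folding, sketched below over a tiny tuple-based expression AST. The AST shape is illustrative only, not any particular compiler's intermediate representation.

```python
def fold(node):
    """Recursively fold constant sub-expressions in a tiny tuple AST.

    A node is either an int literal, a variable name (str), or a tuple
    (operator, left, right).
    """
    if isinstance(node, tuple):
        op, left, right = node
        left, right = fold(left), fold(right)
        if isinstance(left, int) and isinstance(right, int):
            if op == "+":
                return left + right
            if op == "*":
                return left * right
        if op == "*" and right == 1:
            return left          # algebraic identity: x * 1 == x
        return (op, left, right)
    return node                  # literal or variable: nothing to fold

expr = ("+", ("*", 2, 3), ("*", "x", 1))
print(fold(expr))   # ('+', 6, 'x')
```

Production compilers apply dozens of such passes (constant propagation, dead-code elimination, strength reduction) in sequence, each one a small semantics-preserving rewrite like this.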
Image Segmentation and Medical Image Analysis:
Image segmentation is the process of identifying distinct, meaningful regions in an image, which in turn supports application-level tasks. It is widely accepted by experts in this field that no single segmentation algorithm can solve every segmentation problem in every image under every illumination condition. Many researchers have studied and contributed much effort to this subject. However, segmentation remains a great challenge for the image analysis and computer vision communities, and many of its computational issues remain unsolved.
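As a concrete baseline, one of the simplest region-based schemes combines intensity thresholding with connected-component labeling. The sketch below works on a plain list-of-lists intensity grid; the image and threshold are invented, and real segmentation pipelines are far more sophisticated, which is precisely why the problem remains open.

```python
def segment(image, threshold):
    """Threshold an intensity grid, then label 4-connected foreground
    regions -- a minimal region-based segmentation."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and labels[r][c] == 0:
                regions += 1
                stack = [(r, c)]          # flood-fill this new region
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and image[y][x] >= threshold
                            and labels[y][x] == 0):
                        labels[y][x] = regions
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels, regions

img = [
    [9, 9, 0, 0],
    [9, 0, 0, 8],
    [0, 0, 8, 8],
]
labels, n = segment(img, 5)
print(n)   # two distinct bright regions
```

The choice of threshold alone already illustrates the field's central difficulty: a value that separates regions well under one illumination condition fails under another.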
Analysis of medical images is essential in modern medicine. With the ever-increasing amount of patient data, new challenges and opportunities arise across the clinical routine, such as diagnosis, treatment, and monitoring. Early diagnosis provides the opportunity to deliver medical interventions that either prevent the development of a disease or slow its progression and the emergence of symptoms.
Disorders, such as cancers and Alzheimer’s disease, are major contributors to increased health expenditure and are well suited to diagnosis through imaging. Earlier detection of such diseases is expected to realise significant economic and social benefits. In particular, medical imaging can provide the quantitative tools to help industry develop cheaper and more efficient screening tests.
Data Mining and Bio-Informatics:
In recent years, rapid developments in genomics and proteomics have generated a large amount of biological data. Drawing conclusions from these data requires sophisticated computational analyses. Bioinformatics, or computational biology, is the interdisciplinary science of interpreting biological data using information technology and computer science. The importance of this new field of inquiry will grow as we continue to generate and integrate large quantities of genomic, proteomic, and other data. A particularly active area of research in bioinformatics is the application and development of data mining techniques to solve biological problems. Analyzing large biological data sets requires making sense of the data by inferring structure or generalizations from it. Examples of this type of analysis include protein structure prediction, gene classification, cancer classification based on microarray data, clustering of gene expression data, and statistical modeling of protein-protein interactions. We therefore see great potential in increasing the interaction between data mining and bioinformatics.
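Clustering of gene expression data, mentioned above, can be sketched with plain k-means: genes with similar expression profiles across conditions end up in the same cluster. The four toy "genes" and their values below are invented for illustration; real microarray analyses involve thousands of genes and careful normalization.

```python
def kmeans(vectors, k, iterations=10):
    """Plain k-means, a standard way to cluster gene-expression
    profiles so that co-expressed genes group together. Centroids are
    seeded from the first k vectors for determinism."""
    centroids = [list(v) for v in vectors[:k]]
    assignment = [0] * len(vectors)
    for _ in range(iterations):
        # assignment step: nearest centroid by squared Euclidean distance
        for i, v in enumerate(vectors):
            assignment[i] = min(
                range(k),
                key=lambda j: sum((a - b) ** 2
                                  for a, b in zip(v, centroids[j])),
            )
        # update step: move each centroid to the mean of its members
        for j in range(k):
            members = [vectors[i] for i in range(len(vectors))
                       if assignment[i] == j]
            if members:
                centroids[j] = [sum(dim) / len(members)
                                for dim in zip(*members)]
    return assignment

# Toy expression profiles (each row: one gene across 3 conditions)
genes = [(0.1, 0.2, 0.1), (0.0, 0.1, 0.2),   # low expressers
         (5.0, 5.2, 4.9), (5.1, 4.8, 5.0)]   # high expressers
print(kmeans(genes, 2))   # the two low and two high expressers separate
```

This is the "inferring structure from the data" step in miniature; the biological interpretation of the resulting clusters is where bioinformatics proper begins.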
Graph Theory:
Graph theory is a flourishing discipline containing a body of beautiful and powerful theorems of wide applicability. Its explosive growth in recent years is due mainly to its role as an essential structure underpinning modern applied mathematics – computer science, combinatorial optimization, and operations research in particular – but also to its increasing application in the more applied sciences. The versatility of graphs makes them indispensable tools in the design and analysis of communication networks, image segmentation, clustering, and image capture, for instance.
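A small example of that versatility is Dijkstra's shortest-path algorithm, a staple of communication-network design that also underlies graph-based segmentation and clustering methods. The weighted link map below is invented for illustration.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a weighted digraph,
    using a binary heap as the priority queue."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                     # stale heap entry, skip it
        for neighbour, weight in graph.get(node, {}).items():
            nd = d + weight
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist

# Link costs in a toy communication network
links = {"A": {"B": 4, "C": 1}, "C": {"B": 2, "D": 7}, "B": {"D": 1}}
print(dijkstra(links, "A"))   # D is cheapest via A -> C -> B -> D, cost 4
```

The same graph abstraction serves segmentation when pixels become vertices and intensity differences become edge weights, which is why one theorem or algorithm pays off across so many domains.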
Messaging in Wireless Mobile Networks:
Resource conservation is important in the constrained wireless networking environment. Wireless mobile devices, especially low-cost devices, are limited by resources such as battery power, screen size, input methods, memory, and processing power. The relevance of low-cost wireless mobile devices for penetrating third-world markets demands a cost-effective messaging format that fits the constrained wireless environment. Reducing the bytes needed for messaging is a crucial step toward resource conservation. Such a format must also be enhanced with security measures to ensure trusted transmission of messages.
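The byte-reduction point can be made concrete with a compact binary message layout built on Python's `struct` module. The field widths below (1-byte type, 4-byte sender id, 2-byte payload length) are a hypothetical layout for illustration, not any standard wire format.

```python
import struct

# Hypothetical compact header: type (1 B), sender id (4 B), length (2 B),
# packed in network byte order, followed by the raw payload.
HEADER = struct.Struct("!BIH")

def encode(msg_type, sender_id, payload):
    """Serialize a message into a fixed 7-byte header plus payload."""
    return HEADER.pack(msg_type, sender_id, len(payload)) + payload

def decode(data):
    """Recover (type, sender, payload) from an encoded message."""
    msg_type, sender_id, length = HEADER.unpack_from(data)
    return msg_type, sender_id, data[HEADER.size:HEADER.size + length]

wire = encode(1, 42, b"hi")
print(len(wire))      # 9 bytes on the air, versus dozens for a text format
print(decode(wire))   # (1, 42, b'hi')
```

Security would be layered on top of such a format, for example by appending a message authentication code, trading a few extra bytes for trusted transmission.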
Biometrics:
Biometrics is the measurement of human physiological traits (such as iris, face, fingerprint, and hand) and behavioral traits (such as gait and keystroke dynamics). The presence and uniqueness of these traits in individuals make them suitable for personal recognition using computers. Biometrics is an emerging technology with many research challenges, ranging from signal capture to applications. These research challenges/areas are broadly classified into:
- Biometric signal acquisition & processing
- Recognition accuracy & robustness enhancement
- Algorithm development (feature selection, fusion, matching and classification)
- Testing & development of standards
- Multimodal biometrics
- Application development
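The matching and verification steps in the list above can be sketched as a distance comparison between feature vectors. The vectors, score function, and threshold below are invented for illustration; real matchers use trait-specific features (minutiae, iris codes) and carefully tuned decision thresholds.

```python
import math

def match_score(template, probe):
    """Similarity between two biometric feature vectors, here just an
    inverse Euclidean distance mapped into (0, 1]."""
    return 1.0 / (1.0 + math.dist(template, probe))

def verify(template, probe, threshold=0.5):
    """Accept the claimed identity only if similarity clears the
    threshold; tuning it trades false accepts against false rejects."""
    return match_score(template, probe) >= threshold

enrolled = (0.12, 0.80, 0.33)        # stored at enrolment time
genuine  = (0.15, 0.78, 0.30)        # a fresh sample from the same person
impostor = (0.90, 0.10, 0.75)        # a sample from someone else
print(verify(enrolled, genuine), verify(enrolled, impostor))
```

Threshold selection is itself one of the accuracy-and-robustness challenges listed above: the same score function yields different error trade-offs at different operating points.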
Multi-Agent Systems:
In artificial intelligence research, agent-based systems technology has been hailed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated computer programs that act autonomously on behalf of their users, across open and distributed environments, to solve a growing number of complex problems. Increasingly, however, applications require multiple agents that can work together. Multiagent systems are a new paradigm for understanding and building distributed systems, where it is assumed that the computational components are autonomous: able to control their own behaviour in the furtherance of their own goals.
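A minimal sketch of such autonomous, goal-directed components is an ascending auction, one classic multi-agent coordination protocol. The agents, their private valuations, and the bid-increment rule below are invented for illustration.

```python
class Agent:
    """An autonomous bidder whose private valuation is its goal:
    it never bids above what the item is worth to it."""
    def __init__(self, name, valuation):
        self.name = name
        self.valuation = valuation

    def bid(self, current_best):
        offer = current_best + 1
        return offer if offer <= self.valuation else None

def run_auction(agents):
    """Repeatedly invite bids until no agent will raise the price."""
    best_bid, winner = 0, None
    raised = True
    while raised:
        raised = False
        for agent in agents:
            if agent.name == winner:
                continue          # the current leader need not outbid itself
            offer = agent.bid(best_bid)
            if offer is not None:
                best_bid, winner, raised = offer, agent.name, True
    return winner, best_bid

agents = [Agent("a1", 5), Agent("a2", 8), Agent("a3", 3)]
print(run_auction(agents))   # ('a2', 5): highest valuation wins,
                             # paying just above the runner-up's limit
```

Each agent here controls its own behaviour in pursuit of its own goal, yet a coherent global outcome emerges from their interaction, which is the essence of the multi-agent view of distributed systems.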