ICACC-2012
Special Issue on Computing and Communication June 2012
GUEST EDITORS
Dr. Himanshu Aggarwal, Punjabi University, Patiala
Dr. Rinkle Aggarwal, Thapar University, Patiala
Contents (Sr. No., Title, Authors):

1. Review of Apriori Algorithm for finding Association Rules (Supreet Singh, Harpreet Singh)
2. A Tree Based Algorithm for Web Page Change Detection System (Srishti Goel, Rinkle Aggarwal)
3. A Token String Based Algorithm for Data Extraction in Web Usage Mining (Surbhi Anand, Rinkle Aggarwal)
4. Cache Oblivious Algorithms (Upasna Sharma, Himanshu Aggarwal)
5. N-Gram Based Language Identification for Written Text (Navdeep Kaur, Mandeep Kaur, Nishi Sharma)
6. A Neural Network Implementing Back Propagation Sensing Jatropha's Maturity Level (Kestina Rai, Maitreyee Dutta, Sunil Aggarwal)
7. Implementation of Low Interaction Honeypot to Improve the Network Security (Gurdip Kaur, Gurpal Singh)
8. Performance Analysis of Service Broker and Load Balancing Policy in Cloud Computing (Mandeep Devgan, Kanwalvir Singh Dhindsa)
9. Improvements in CINR and RSSI in WiMAX Technology (Rishu Verma, Sarabjot Singh)
10. Advancement in Network Security Using IPSec (Mohit Kumar, Parmpreet Singh, Priya Joshi, Reetika Aggarwal, Ashish Jalota)
11. Building a Successful CRM Using Data Mining Techniques (Gaurav Gupta, Himanshu Aggarwal)
12. A Secure Authentication System Using Enhanced Serpent Technique (Raman Kumar, Manjinder S. Kahlon, Amit Kalra)
13. Concept Mining from Database to Find Experts' Topics (Ankita Dwivedi, Vridhi Madaan, Archana Singh)
14. A Comprehensive Study of Classification of NoSQL Databases (Karamjit Cheema, Rinkle Aggarwal)
15. Study of DDoS Attacks Using DETER Testbed (Daljeet Kaur, Monika Sachdeva, Krishan Saluja)
16. Virtualization with Cloud Computing (Sagrika Chanday, Sandeep Sharma, Gursewak Singh Brar, Sunny Chanday)
17. Comparison of Different Wavelets for Speckle Denoising by Applying Hard and Soft Thresholding (Sandeep Kaur, Ranjit Singh)
18. Clone Cost Effects? (Amanpreet Kaur, Dhavleesh Rattan)
19. An Analysis of the Engineering Students' Attitude towards Technology Enhanced Language Learning (Gurleen Ahluwalia)
20. Design of Reconfigurable Dual-Beam Concentric Ring Array Antenna Using Biogeography Based Optimization (Urvinder Singh, Gurkamal Kaur)
21. Implementation of Pattern Recognition System Using NI Vision (R.S. Uppal, Livjeet Kaur)
22. Segmentation of Prostate Boundary from Ultrasound Images Using Ant Colony Optimization (Vikas Wasson, Baljit Singh)
23. Aspect Oriented Software Development: A Framework for Software Reusability and Maintainability (Mandeep Singh, Satwinder Singh)
24. Improved Approach for Semantic Web Search Engines and Automatic Discovery of Personal Name Aliases from the Web (Ritu Demla, Kanwalvir S. Dhindsa, Vishal Khatri)
25. Performance Analysis of HBase on HDFS (Jatinder Kaur, Gurleen Kaur Dhaliwal, Gagandip Singh)
26. Security of LFSR Based Stream Ciphers Using Genetic Algorithm (Rupinder Minhas, Jagjit Minhas, Lovejeet Goraya)

REVIEW OF APRIORI ALGORITHM FOR FINDING ASSOCIATION RULES
REVIEW OF APRIORI ALGORITHM FOR FINDING ASSOCIATION RULES
Supreet Singh, Munish Saini, Harpreet Singh
Abstract—Association rule mining is the process of finding interesting and useful relationships between various data elements of a database. As database sizes grow rapidly, efficient methods are required for finding association rules effectively. This paper presents a review of the classical Apriori algorithm and of recent work done by various researchers to improve its performance.
Keywords— Apriori algorithm; Association rule; Frequent itemsets.
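For readers new to the algorithm under review, the classical level-wise Apriori loop (candidate generation, support counting, pruning) can be sketched in a few lines of Python. This is a minimal illustration with invented transaction data, not code from the paper:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every itemset whose support count is >= min_support."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}      # candidate 1-itemsets
    frequent = {}
    k = 1
    while current:
        # Support counting: one pass over the transactions per level
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # Join step: build (k+1)-itemset candidates from frequent k-itemsets
        keys = list(survivors)
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        # Prune step: every k-subset of a candidate must itself be frequent
        current = {c for c in current
                   if all(frozenset(s) in survivors for s in combinations(c, k))}
        k += 1
    return frequent

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
freq = apriori(baskets, min_support=2)
```

The prune step is what the surveyed improvements typically attack: it exploits the downward-closure property (every subset of a frequent itemset is frequent) to avoid counting hopeless candidates.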
A TREE BASED ALGORITHM FOR WEB PAGE CHANGE DETECTION SYSTEM
Srishti Goel, Rinkle Rani Aggarwal
Abstract: People are actively using the Internet to exchange information across the world, so new web pages are uploaded and existing pages are updated very frequently. Because the contents of web pages change continuously and rapidly, it becomes very difficult to observe the changes made to a web page and to retrieve its earlier versions. To retrieve pages efficiently, monitor the changes made to them, and compare a refreshed page against the old page in minimum browsing time, an effective monitoring system for web page change detection based on user profiles is needed. A web page change detection system can be implemented using various tools or algorithms. In this paper, we explain algorithms that detect changes in the content and structure of a web page.
Keywords: Web page change detection, Signature of node, Tree matching, XML, Change monitoring
A TOKEN STRING BASED ALGORITHM FOR DATA EXTRACTION IN WEB USAGE MINING
Surbhi Anand, Rinkle Rani Aggarwal
Abstract: The World Wide Web is a massive repository of web pages that provides an abundance of information for Internet users, and its size has become tremendously large. Web usage mining applies mining techniques to log data to extract the behaviour of users, which is used in applications such as personalization, adaptive web sites, customer profiling, and creating attractive web sites. A web usage mining process consists of three phases: data preprocessing, pattern discovery, and pattern analysis. Data preprocessing tasks must be performed before mining algorithms are applied, to convert the raw data collected from server logs into a data abstraction. Appropriate analysis of the web log file proves beneficial for managing websites effectively from both the administrative and the users' perspective, and preprocessing results have a direct impact on the later phases. The preprocessing of the web log file therefore plays an essential part in the web usage mining process. This paper emphasizes the mining of web access logs and makes some exploration in the field of data preprocessing.
Keywords: World Wide Web; Preprocessing; Data Cleaning; Web Usage Mining; Web Server Logs.
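The data-cleaning phase described above (filtering raw server log entries before mining) might look like the following minimal sketch. The Common Log Format, the status/extension filtering rules, and the sample entries are assumptions for illustration, not taken from the paper:

```python
import re

# Common Log Format: host ident user [timestamp] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d{3}) \S+'
)

def preprocess(log_lines):
    """Data cleaning: keep successful page requests, drop embedded resources."""
    records = []
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed entries
        url, status = m.group("url"), int(m.group("status"))
        # Typical cleaning rules: drop failed requests and images/styles/scripts
        if status != 200 or re.search(r'\.(gif|jpg|png|css|js)$', url):
            continue
        records.append((m.group("host"), m.group("ts"), url))
    return records

logs = [
    '10.0.0.1 - - [01/Jun/2012:10:00:00 +0530] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [01/Jun/2012:10:00:01 +0530] "GET /logo.png HTTP/1.1" 200 2048',
    '10.0.0.2 - - [01/Jun/2012:10:00:02 +0530] "GET /missing HTTP/1.1" 404 128',
]
clean = preprocess(logs)
```

The cleaned (host, timestamp, url) tuples are the kind of data abstraction on which session identification and pattern discovery would then operate.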
CACHE OBLIVIOUS ALGORITHMS
Upasna Sharma, Himanshu Aggarwal
Abstract: This paper presents approaches that help us analyze the running time of an algorithm on a computer with a memory hierarchy of limited associativity, in terms of various cache parameters. The performance of an implemented algorithm depends on many factors: its theoretical asymptotic performance, the programming language chosen, the choice of data structures, and the configuration of the target machine. A unifying notion for cache-efficient and disk-efficient algorithms and data structures is cache obliviousness, introduced by Frigo, Leiserson, Prokop, and Ramachandran in 1999. In this paper, we study how such algorithms use cache resources efficiently across the hierarchy of memory models in modern computers.
Keywords: Cache aware, cache oblivious models, cache oblivious algorithms.
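The cache-oblivious idea can be made concrete with the textbook divide-and-conquer matrix transpose: the recursion never mentions cache size or line length, yet its sub-blocks eventually fit in cache at every level of the hierarchy. This sketch is an editorial illustration, not code from the paper:

```python
def co_transpose(a, b, r0, r1, c0, c1):
    """Cache-oblivious transpose of a[r0:r1][c0:c1] into b.
    Recursively halve the larger dimension; no cache parameters appear."""
    rows, cols = r1 - r0, c1 - c0
    if rows <= 2 and cols <= 2:            # base case: a tiny block
        for i in range(r0, r1):
            for j in range(c0, c1):
                b[j][i] = a[i][j]
    elif rows >= cols:                     # split the row range
        mid = r0 + rows // 2
        co_transpose(a, b, r0, mid, c0, c1)
        co_transpose(a, b, mid, r1, c0, c1)
    else:                                  # split the column range
        mid = c0 + cols // 2
        co_transpose(a, b, r0, r1, c0, mid)
        co_transpose(a, b, r0, r1, mid, c1)

n = 4
a = [[i * n + j for j in range(n)] for i in range(n)]
b = [[0] * n for _ in range(n)]
co_transpose(a, b, 0, n, 0, n)
```

Contrast this with a cache-aware version, which would tile the matrix with an explicit block size tuned to one particular cache.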
N-GRAM BASED LANGUAGE IDENTIFICATION FOR WRITTEN TEXT
Navdeep Kaur, Mandeep Kaur, Nishi Sharma
ABSTRACT
Language identification technology is widely used in the domains of machine learning and text mining. Many researchers have achieved excellent results on a few selected European languages, but the majority of African and Asian languages remain untested. The primary objective of this paper is to evaluate the performance of our new n-gram based language identification algorithm on languages written in Arabic script, automatically identifying the language of Arabic-script documents. Our approach to the language identification problem is based on the n-gram analysis method, which achieves good performance even with relatively small training sets. The system works for three languages, namely Urdu, Arabic and Shahmukhi, and classifies the language of a given document. An overall accuracy of about 95.33% is achieved on test documents.
KEYWORDS: Language Identification; multilingualism; classification; statistical system; n-gram.
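N-gram language identification of the kind the abstract describes is commonly implemented by comparing ranked character n-gram frequency profiles (the Cavnar-Trenkle "out of place" measure). The sketch below uses toy single-sentence training corpora and is an illustration of the general technique, not the authors' system:

```python
from collections import Counter

def ngram_profile(text, n=3, top=300):
    """Ranked character n-gram profile of a text (most frequent first)."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top)]

def out_of_place(doc_profile, lang_profile):
    """Cavnar-Trenkle distance: sum of rank displacements between profiles."""
    rank = {g: i for i, g in enumerate(lang_profile)}
    max_penalty = len(lang_profile)
    return sum(abs(i - rank[g]) if g in rank else max_penalty
               for i, g in enumerate(doc_profile))

def identify(doc, profiles, n=3):
    """Pick the training language whose profile is closest to the document's."""
    dp = ngram_profile(doc, n)
    return min(profiles, key=lambda lang: out_of_place(dp, profiles[lang]))

# Toy training corpora, purely for illustration
profiles = {
    "english": ngram_profile("the quick brown fox jumps over the lazy dog " * 5),
    "latin": ngram_profile("lorem ipsum dolor sit amet consectetur adipiscing " * 5),
}
```

The same scheme applies to Arabic-script languages once the profiles are trained on Urdu, Arabic and Shahmukhi text; only the training data changes.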
A Neural Network Implementing Back Propagation Sensing Jatropha’s Maturity Level
Kestina Rai, Maitreyee Dutta, Sunil Aggarwal
Abstract: Jatropha curcas (Sanskrit: danti pratyanshrani) is a species of flowering plant in the spurge family. In 2007, Goldman Sachs cited Jatropha curcas as one of the best candidates for future biodiesel production. It is resistant to drought and pests, and produces seeds containing 27-40% oil, averaging 34.4%; the press cake remaining after oil extraction can also be considered for energy production. Traditionally, human experts perform the identification of Jatropha curcas, whose quality depends on the type and size of defects as well as skin color and fruit size. A Grading System of Jatropha (GSJ) using the color histogram method was later developed to distinguish the ripeness level of the fruits based on color intensity. Although this automated approach was better than human expert identification, it deals with only one aspect of the fruit, namely color. In this paper, we propose an artificial neural network approach to build an expert system that measures fruit quality based not only on color intensity but also on other features such as fruit size and texture, because this type of system can learn from examples, like humans, and can therefore give better results.
Keywords: Artificial Neural Network, Back Propagation Network, Feedforward ANN, Pattern Recognition.
IMPLEMENTATION OF LOW INTERACTION HONEYPOT TO IMPROVE THE NETWORK SECURITY
Gurdip Kaur
Dr. Gurpal Singh
Jatinder Singh Saini
Abstract: Honeypots are increasingly used to provide early warning of potential intruders, identify flaws in security strategies, and improve an organization's overall security awareness. Honeypots can simulate a variety of internal and external devices, including web servers, mail servers, database servers, application servers, and even firewalls [6]. Among the security tools developed to protect our networks, such as firewalls, IDS (Intrusion Detection Systems) and IPS (Intrusion Prevention Systems), there is a relatively new kind of tool called the honeypot. Spitzner (2003) defines it as follows: "A Honeypot is a security resource whose value lies in being probed, attacked, or compromised." This security resource is so flexible that an organization can use it for intrusion detection, detecting unethical behavior of employees, delaying attacks on its networks, forensics, gathering information about how attacks were made, virus research, and more. Honeypots are thus a highly flexible security tool with multiple applications, such as prevention, detection, and information gathering. All honeypots share the same concept: a security resource that should not see any production or authorized activity. In other words, deploying honeypots in a network should not affect critical network services and applications.
Keywords: Intrusion Detection System; low interaction honeypot; honeyd; network security; unused IP; arpd.
PERFORMANCE ANALYSIS OF SERVICE BROKER AND LOAD BALANCING POLICY IN CLOUD COMPUTING
Mandeep Devgan, KVS Dhindsa, Mandeep Singh
Abstract: Cloud computing is an on-demand service in which shared resources, information, software and other devices are provided according to the client's requirements at a specific time. The performance of an application is affected by the service broker policy and the load balancing policy used across different virtual machines in a single Data Center (DC). In this paper, we analyze the effect of different combinations of load balancing algorithms and service broker policies on the performance of a cloud-based application. We also explain the two policies, service broker and load balancing, in a single DC, using a CloudSim-based analysis and modeling tool named 'CloudAnalyst'.
Advancement in Network Security Using IPSec
Mohit Kumar Parmpreet Singh Priya Joshi Reetika Aggarwal Ashish Jalota
IPsec (IP Security) is a standardized framework for securing Internet Protocol (IP) communications by encrypting and/or authenticating each IP packet in a data stream. There are two modes of IPsec operation: transport mode and tunnel mode. In transport mode, only the payload (message) of the IP packet is encrypted; the packet is fully routable, since the IP header is sent as plain text, but it cannot cross NAT interfaces, as that would invalidate its hash value. In tunnel mode, the entire IP packet is encrypted and must then be encapsulated into a new IP packet for routing to work. IPsec uses cryptographic security services to protect communications over IP networks and supports network-level peer authentication, data origin authentication, data integrity, data confidentiality (encryption), and replay protection. The Microsoft implementation of IPsec is based on Internet Engineering Task Force (IETF) standards.
Building a Successful CRM Using Data Mining Techniques
Gaurav Gupta
Himanshu Aggarwal
Abstract
Customer Relationship Management (CRM) refers to the methodologies and tools that help businesses manage customer relationships in an organized way. Advancements in technology have made relationship marketing a reality in recent years. Technologies such as data warehousing, data mining, and campaign management software have made customer relationship management a new area where firms can gain a competitive advantage. Through data mining in particular, organizations can identify valuable customers, predict future behaviors, and make proactive, knowledge-driven decisions. Data mining is the process of applying a variety of data analysis and modeling techniques to discover patterns and relationships in data that may be used to make accurate predictions; it can help select the right prospects on whom to focus. Various techniques exist among data mining software, each with its own advantages and challenges for different types of applications. Variable selection and class distribution affect the performance of building a successful CRM.
Keywords - CRM, Marketing, Relationship, Data Mining, Variable Selection, Sampling and Ensemble.
An Efficient and Secure Authentication System Using Enhanced Serpent Technique
Raman Kumar
ABSTRACT
With the technologies now available for hacking, there is a need to provide users with a secure environment that protects their resources against unauthorized access by enforcing control mechanisms. To counteract the increasing threat, the enhanced serpent technique has been introduced. It encapsulates the enhanced serpent technique and provides the client a completely unique and secure authentication tool to work with. This paper proposes a hypothesis regarding the use of the enhanced serpent technique and is a comprehensive study of the subject of using the enhanced serpent technique (EST), which forms the basis for secure communication between the communicating entities. Several password authentication protocols have been introduced, each claiming to withstand attacks such as replay, password file compromise, and denial of service. We introduce a new technique through which people can communicate with each other securely using EST. In this technique, small key sizes of 128, 192 and 256 bits are needed by both the sender and the receiver, after which they can communicate very securely; the server providing the communication between sender and receiver has no knowledge of how to decrypt the text. The proposed scheme is therefore secure and efficient against notorious conspiracy attacks.
Keywords
Enhanced Serpent Technique (EST), Random Attacks, Security Threats, Cryptographic Techniques and Information Security.
CONCEPT MINING FROM DATA BASE TO FIND EXPERTS TOPICS
Ankita Dwivedi
Vridhi Madaan Batra
Archana Singh
Abstract: This paper explores the concept extraction task, an important step in natural language processing. Nowadays, the need for automation in locating important terminology within a specific domain has become crucial: the manual process of finding concepts and refining them further requires extensive labor and is time-consuming, and automating the entire process reduces manual involvement in concept extraction. This paper mainly focuses on the application of finding domain experts by locating interesting topics and assigning them to the experts. A profile of each expert is maintained, which gives information about their expertise. After the concepts are located, manual verification is required, as not all results are relevant; domain experts validate them.
Keywords
Ontology, Concept, Term, Part Of Speech, n-gram.
A Comprehensive Study of Classification of NoSQL Databases
Karamjit Cheema, Rinkle Aggarwal
Abstract: In this paper, we examine a number of "NoSQL" data stores designed to scale simple OLTP-style application loads over many servers. Originally motivated by Web 2.0 applications, these systems are designed to scale to thousands or millions of users doing updates as well as reads, in contrast to traditional DBMSs and data warehouses. The NoSQL movement began in early 2009 and is growing rapidly. Different NoSQL databases take different approaches; what they have in common is that they are not relational. Their primary advantage is that, unlike relational databases, they handle unstructured data such as word-processing files, e-mail, multimedia, and social media efficiently.
Keywords : NoSQL; Key-value; Column-oriented; Document; Database; Big Data.
Study of DDoS attacks using DETER Testbed
Daljeet Kaur
Monika Sachdeva
Krishan Kumar
ABSTRACT
In the present era, the world is highly dependent on the Internet, which is considered the main infrastructure of the global information society; the availability of information and services is therefore critical for the socio-economic growth of society. However, the inherent vulnerabilities of the Internet architecture provide opportunities for many attacks on its infrastructure and services. The distributed denial-of-service (DDoS) attack is one such attack, and it poses an immense threat to the availability of the Internet. These attacks not only congest a server but also affect the performance of other servers on the network that are connected to the backbone link directly or indirectly. Analyzing the effect of DDoS attacks on FTP services requires repeatable cyber-security research, which is vital to the scientific advancement of the field; to meet this requirement, the cyber DEfense Technology Experimental Research (DETER) testbed has been developed. In this paper, we create a dumb-bell topology and generate FTP background traffic. Different types of DDoS attacks are then launched along with the FTP traffic using attack tools available in the DETER testbed, and the throughput of the FTP server is analyzed with and without DDoS attacks.
Keywords
DDoS, availability, vulnerability, confidentiality, botnet.
VIRTUALISATION WITH CLOUD COMPUTING
(Review Paper)
Ms. Sagrika
Dr.Sandeep Sharma
Dr.G.S. Brar
Mr. Sunny Chanday
ABSTRACT
"Cloud" computing stands on decades of research in virtualization, distributed computing, utility computing and, more recently, networking, web and software services. It implies a service-oriented architecture, reduced information technology overhead for the end user, great flexibility, reduced total cost of ownership, on-demand services and many other things. Virtualization has quickly become a vital technology across all parts of IT environments over the last couple of years; it is now in use across nearly all enterprises, which plan to move some applications to cloud computing in the future, a move that will compound the issues of virtualization; we can say the two technologies catalyze each other. The strategies are similar in that both help reduce the size and control the expansion of the data center, cutting the costs of hardware, power and cooling, space, management and disaster recovery, but their initial and ongoing costs differ. This paper focuses on the concept of "cloud" computing together with virtualization technology. Our study also examines whether virtualization really helps to make the cloud better, or whether the two technologies make the work of the IT industry more complex.
Comparison of different wavelets for Speckle denoising by applying Hard and Soft thresholding
Sandeep Kaur
Ranjit Singh
Abstract— An image is often corrupted by noise during acquisition and transmission. Image denoising removes the noise while retaining the important image features as much as possible. In this paper, denoising of speckle noise with the discrete wavelet transform (DWT) is discussed using different wavelets such as Haar, Daubechies and Symlets. The two basic thresholding techniques are applied with each of them, and the comparison is made using parameters such as PSNR and MSE.
Keywords- denoising, speckle noise, discrete wavelet transform, Haar wavelet, Daubechies wavelet, Symlet, discrete Meyer wavelet, PSNR, MSE.
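The two thresholding rules the abstract compares can be stated in a few lines. The sketch below pairs them with a one-level Haar DWT so the pipeline (transform, threshold the detail coefficients, invert) is visible end to end; the toy signal and the threshold value are illustrative assumptions, not the paper's data:

```python
import math

def hard_threshold(coeffs, t):
    """Hard thresholding: zero coefficients with magnitude below t,
    leave the rest unchanged."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Soft thresholding: zero small coefficients and shrink the survivors
    toward zero by t (smoother result, slight bias in the signal)."""
    return [math.copysign(abs(c) - t, c) if abs(c) >= t else 0.0 for c in coeffs]

def haar_dwt(x):
    """One level of the Haar DWT: (approximation, detail) coefficients."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar DWT level."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

noisy = [1.0, 1.1, 4.0, 3.9]
a, d = haar_dwt(noisy)
denoised = haar_idwt(a, soft_threshold(d, 0.2))  # small details treated as noise
```

PSNR/MSE comparison then amounts to computing the error between `denoised` and a clean reference under each rule and wavelet.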
CLONE COST EFFECTS?
Er. Amanpreet Kaur Chela
Er. Dhavleesh Rattan
ABSTRACT: The demand for software has increased with the development of technology and communication systems, and with it maintenance effort, which is a vital factor. Software today is not identical to that of the past, owing to the development of new programming languages and their principles. To improve the quality of any software, maintenance is a must, and cloning in source code files makes code difficult to modify; several models have been designed to overcome this problem. This paper presents an extension of an analytical cost model to evaluate cloning. As the size and complexity of software increase, it also becomes essential to develop high-quality software cost-effectively within a specified period. This paper presents a study of cloned code in which large open-source systems are used and various new parameters are added to the clone calculation.
Keywords: Source Code, Clone, modifications and fragment
AN ANALYSIS OF THE ENGINEERING STUDENTS' ATTITUDE TOWARDS TECHNOLOGY ENHANCED LANGUAGE LEARNING
Dr.Gurleen Ahluwalia
Abstract: This study investigates students' perceptions of using technological tools such as blogs, wikis, interactive software and the researcher's language learning website as a means to supplement in-class language learning activities. It evaluates a language laboratory program in which forty first-year engineering students from a college in Punjab were introduced to these tools and instructed to use them for their laboratory work. The data collected reveal that, despite encountering some difficulties, students had an overall positive attitude towards using these tools in their learning of English, and found learning English through technology interesting and effective.
Keywords: Blogs, Wikis, Language learning, Technology-Enhanced Language Learning (TELL)
Design of Reconfigurable Dual-Beam Concentric Ring Array Antenna using Biogeography Based Optimization
Urvinder Singh, Dr. T.S. Kamal, Ms. Gurkamal Kaur
ABSTRACT
This paper describes a method of designing a reconfigurable dual-beam antenna array using the novel Biogeography Based Optimization (BBO) algorithm. The objective is to generate dual pencil-beam patterns with two different pre-defined sidelobe levels from a concentric ring array of isotropic antennas. The two patterns differ only in their radial phase distribution while sharing a common radial amplitude distribution; the pattern with the lower sidelobe level is obtained by switching the phase distribution from zero to the optimum value. The phases and amplitudes of the elements in each ring are the same but vary radially. BBO is used to obtain the optimum set of phase and amplitude distributions that generates the dual pencil beams.
Keywords
Antenna; optimization; biogeography based optimization; concentric ring array; antenna array; reconfigurable antenna; dual-beam antenna
Implementation of Pattern Recognition System using NI Vision
Mohinder Pal Joshi
R.S Uppal
Livjeet Kaur
ABSTRACT
The emerging biometric personal identification systems have increased the need for accurate and efficient pattern recognition. Pattern recognition is the backbone of any biometric identification system and is widely used these days in biometric personal identification systems [1] [2]. Pattern matching compares the user template with templates from the database using some matching metric. This paper presents a new and simple approach to pattern recognition based on template matching, implemented in NI Vision Assistant and NI LabVIEW.
Keywords
Pattern Recognition, Convolution kernel, Color extraction, Filtering, Normalized Cross Correlation, Scale- and Rotation-Invariant Matching, Pyramidal Matching, NI Vision Assistant, NI LabVIEW.
Segmentation of Prostate Boundary from Ultrasound Images using Ant Colony Optimization
Vikas Wasson, Baljit Singh
Abstract: Prostate cancer is one of the leading causes of death in adult and elderly men. For treatment to succeed, it is very important to detect the disease at an early stage. Until now, manual contouring has been the only reliable method used for this purpose, but it is a tedious and very time-consuming task. In this paper, an automatic multi-stage algorithm for prostate boundary detection from ultrasound images is presented. In the first stage, a sticks filter is used to enhance the contrast of the image and reduce its speckle. Next, an initial contour is determined from this enhanced image. The final step determines the volume of the prostate by segmenting the prostate boundary using Ant Colony Optimization. In the last section, the performance of the present research is demonstrated by comparing it with genetic algorithms and manual outlining.
Keywords: ACO, Prostate, TRUS.
Aspect Oriented Software Development- A Framework for Software Reusability and Maintainability
Er. Mandeep Singh
Er. Balwinder Kumar
Prof. Satwinder Singh
Abstract
Aspect-oriented software development (AOSD) is widely used in industry and in research. It provides new abstractions and complexity dimensions for software engineering, and as a result poses new problems to empirical software engineering, requiring new frameworks that specifically measure the reusability and maintainability of aspect-oriented systems. This paper presents a framework for aspect-oriented software that contains two types of components: a suite of metrics and a quality model, both based on the principles and existing metrics of software engineering. The AOSD framework has different characteristics and properties, varying control levels and different complexity degrees; based on this analysis, the advantages and drawbacks of the AOSD framework components are discussed.
Keywords: Aspect-oriented software development, software metrics, a model for quality, empirical software engineering, cohesion and size metrics.
Improved Approach for Semantic Web Search Engines and Automatic Discovery of Personal Name Aliases from the Web
Ritu Demla, Kanwalvir S. Dhindsa, Vishal Khatri
Abstract
In this paper we present a new approach for efficient web search engines and for extracting personal name aliases from the web for social networking websites. Given a personal name, the proposed method first extracts a set of candidate aliases, then ranks the extracted candidates according to the likelihood of each candidate being a correct alias of the given name. A novel, automatically extracted lexical-pattern-based method is proposed to efficiently extract a large set of candidate aliases from snippets retrieved from a web search engine. We define numerous ranking scores to evaluate candidate aliases using three approaches: lexical pattern frequency, word co-occurrences in an anchor text graph, and page counts on the web. To construct a robust alias detection system, we integrate the different ranking scores into a single ranking function using ranking support vector machines. In addition, many search engine techniques have been proposed to answer user queries efficiently and effectively, but they are vulnerable when answering intelligent queries, because their results depend on the information available in web pages. The main focus of these search engines is solving such queries with close-to-accurate results in a short time using well-researched algorithms; however, with this approach they either show inaccurate results or show accurate but potentially unreliable ones. We therefore propose a layered model of the Semantic Web that solves this problem by providing tools and technologies to enable machine-readable semantics in current web content. For the practical work, we build the proposed semantic web search engine and evaluate the proposed method of extracting personal name aliases.
Index Terms—Pattern Extraction, Semantic Web Documents, Search Engine, Name Aliases, Candidate Ranking Algorithm.
Performance Analysis of HBase on HDFS- A Column-Oriented Database Approach
Jatinder Kaur, Gurleen Kaur Dhaliwal, Gagandip Singh
Abstract: HBase is an open-source distributed storage system for managing large volumes of unstructured data. Based on Google's BigTable, it is a non-SQL database system that provides an alternative to storage in traditional RDBMS systems. This paper explores and evaluates HBase data storage on the Hadoop Distributed File System (HDFS). The purpose of the current work is to convert a simple relational schema into HBase's column-oriented data store and then evaluate the performance of random writes and random reads of rows.
Keywords: HBase, Hadoop, HDFS, non SQL database, key, value.
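Schematically, the relational-to-column-oriented conversion the abstract describes maps each relational row to a row key plus a set of `family:qualifier` cells. The sketch below models that layout with plain dicts purely for illustration; the `info` family name and the sample data are assumptions, and no real HBase client is used:

```python
# HBase-style layout modeled as: { row_key: { "family:qualifier": value } }
def to_hbase_rows(relational_rows, key_col, family):
    """Convert relational dict-rows into a column-family keyed store.
    In HBase both keys and values are byte strings, so everything is str here."""
    table = {}
    for row in relational_rows:
        row_key = str(row[key_col])
        table[row_key] = {f"{family}:{col}": str(v)
                          for col, v in row.items() if col != key_col}
    return table

users = [
    {"id": 1, "name": "Jatinder", "city": "Patiala"},
    {"id": 2, "name": "Gurleen", "city": "Ludhiana"},
]
store = to_hbase_rows(users, key_col="id", family="info")
```

Random reads and writes of the kind the paper benchmarks then become lookups and assignments keyed by `row_key`, which is exactly the access pattern HBase optimizes.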
SECURITY OF LFSR BASED STREAM CIPHERS USING GENETIC ALGORITHM
Prof Rupinder Singh
Prof Jagjit Singh
Prof Lovejeet Singh
ABSTRACT
A stream cipher is a symmetric key cipher in which a stream of plaintext is mixed with a random cipher bit stream (the key stream), typically by a logical operation; usually one byte is encrypted at a time. In this paper, we combine two technologies: the Linear Feedback Shift Register (LFSR) and the Genetic Algorithm (GA). We propose to develop an algorithm that encrypts plain text into cipher text using a genetic algorithm. A cipher is a secret method of writing whereby plaintext is transformed into cipher text; the process of transforming plaintext into cipher text is called encryption, and the reverse process is called decryption. Both encryption and decryption are controlled by cryptographic key parameters. We therefore use a linear feedback shift register together with a genetic algorithm, and propose a scheme in which the use of GA is explained with the help of the LFSR. Two algorithms are developed: one for the sender side and another for the receiver side.
Keywords: Cryptography; Cryptanalysis; Genetic Programming; Genetic Algorithm; Linear Feedback Shift Register (LFSR); RC4 Algorithm; Crossover; Mutation.
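The LFSR half of such a scheme can be sketched as follows: a Fibonacci LFSR produces a keystream, and encryption is a byte-wise XOR with it, so the same operation decrypts. The 4-bit register length and tap positions below are illustrative assumptions (far too small for real security), and the GA that would search for good tap/seed choices is omitted:

```python
def lfsr_keystream(seed_bits, taps, n):
    """Generate n keystream bits from a Fibonacci LFSR.
    seed_bits: initial register contents; taps: positions XORed for feedback."""
    reg = list(seed_bits)
    out = []
    for _ in range(n):
        out.append(reg[-1])            # output the last register bit
        fb = 0
        for t in taps:                 # feedback = XOR of the tapped bits
            fb ^= reg[t]
        reg = [fb] + reg[:-1]          # shift right, insert feedback at the front
    return out

def stream_xor(data, seed_bits, taps):
    """Stream cipher: XOR each byte with 8 keystream bits (self-inverse)."""
    ks = lfsr_keystream(seed_bits, taps, 8 * len(data))
    out = bytearray()
    for i, byte in enumerate(data):
        k = 0
        for b in ks[8 * i:8 * i + 8]:  # pack 8 keystream bits into one byte
            k = (k << 1) | b
        out.append(byte ^ k)
    return bytes(out)

msg = b"ICACC"
seed, taps = [1, 0, 1, 1], [0, 3]      # toy 4-bit register, illustrative taps
ct = stream_xor(msg, seed, taps)
pt = stream_xor(ct, seed, taps)        # applying the same keystream decrypts
```

In a GA-based variant, a fitness function (e.g. keystream period or statistical randomness) would score candidate tap/seed combinations, with crossover and mutation evolving the population.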