
Software engineering




Software blueprint

A software blueprint is the final product of a software blueprinting process. Its name derives from the analogy drawn with the popular use of the term blueprint (within traditional construction industry). Therefore, a true software blueprint should share a number of key properties with its building-blueprint counterpart:

Contents

1 Properties common to blueprints
  1.1 Step-by-step procedure from blueprint to finished article
  1.2 Focused on a single application aspect
  1.3 Selection of optimal description medium
  1.4 Localization of aspect logic
  1.5 Orthogonalization
2 Examples
  2.1 GUI form design
  2.2 Machine translatable co-ordination languages (e.g. CDL)
  2.3 Class designers
  2.4 Software designers
3 See also
4 External links

Properties common to blueprints

Step-by-step procedure from blueprint to finished article

Software blueprinting processes advocate confining inspirational activity (problem solving) as much as possible to the early stages of a project, in the same way that the construction blueprint captures the inspirational activity of the construction architect. Following the blueprinting phase, only procedural activity (following prescribed steps) is required. This means that a software blueprint must be prescriptive and therefore exhibit the same formality as other prescriptive languages such as C++ or Java. Software blueprinting exponents claim that this provides the following advantages over approaches in which inspirational activity persists throughout development:

• Potential for automatic machine translation to code
• Predictable timescales after the blueprinting phase
• Software architect's intentions reflected directly in code

Focused on a single application aspect

Page 2: Software engineering

Software blueprints focus on a single aspect so that the choice of description medium is not compromised and all of the relevant logic is localized.

Selection of optimal description medium

The single aspect focus of a software blueprint means that the optimal description medium can be selected. For example, algorithmic code may be best represented using textual code whereas GUI appearance may be best represented using a form design.

The motivation behind selecting an intuitive description medium (i.e. one that matches well with mental models and designs for a particular aspect) is to improve:

• Ease of navigation
• Ease of understanding
• Fault detection rate
• Ability to manage complexity

Localization of aspect logic

The localization of aspect logic promoted by the software blueprinting approach is intended to improve navigability; it rests on the assumption that the application programmer most commonly wishes to browse application aspects independently.

Orthogonalization

Software blueprinting relies on realizing a clean separation between logically orthogonal aspects to facilitate the localization of related logic and use of optimal description media described above.

Examples

GUI form design

The GUI form design (see GUI toolkit) is widely adopted across the software industry and allows the programmer to specify a prescriptive description of the appearance of GUI widgets within a window. This description can be translated directly to the code that draws the GUI (because it is prescriptive).
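As an illustration, here is a minimal sketch, not tied to any real toolkit's saved file format, of how a prescriptive form description can be translated mechanically into the code that draws it. The FORM structure and the generate_gui_code helper are invented for this example; tkinter is used only as a familiar target.

```python
# Hypothetical blueprint produced by a form designer: purely prescriptive,
# so every widget can be translated directly into drawing code.
FORM = [
    {"type": "Label",  "name": "title", "text": "Login"},
    {"type": "Entry",  "name": "user"},
    {"type": "Button", "name": "ok",    "text": "OK"},
]

def generate_gui_code(form):
    """Translate the prescriptive description into tkinter source text.

    No design decisions remain at this stage -- the translation is purely
    procedural, which is the property the article attributes to blueprints.
    """
    lines = ["import tkinter as tk", "root = tk.Tk()"]
    for w in form:
        args = ", ".join(f'{k}="{v}"' for k, v in w.items()
                         if k not in ("type", "name"))
        lines.append(f'{w["name"]} = tk.{w["type"]}(root{", " + args if args else ""})')
        lines.append(f'{w["name"]}.pack()')
    lines.append("root.mainloop()")
    return "\n".join(lines)

print(generate_gui_code(FORM))
```

Because the input is prescriptive, running the generator twice on the same form always yields the same code, which is what makes the translation automatable.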

Machine translatable co-ordination languages (e.g. CDL)

Languages such as the Concurrent Description Language (CDL) separate an application's macroscopic logic (communication, synchronization and arbitration) from the rest of a complex multi-threaded and/or multi-process application, gathering it into a single contiguous visual representation. Because this description is prescriptive, it can be machine-translated into an executable framework that may be tested for structural integrity (detection of race conditions, deadlocks and so on) before the microscopic logic is available.

Class designers

Class designers allow the specification of arbitrarily complex data structures in a convenient form and the prescriptive nature of this description allows generation of executable code to perform list management, format translation, endian swapping and so on.
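A small sketch of the kind of code such a description makes possible. The FIELDS record layout below is invented for illustration, and Python's standard struct module stands in for generated code performing format translation and endian swapping:

```python
import struct

# Hypothetical output of a class designer: a record described as
# (field name, struct format character) pairs. From this prescriptive
# description, pack/unpack code for either byte order can be generated.
FIELDS = [("id", "I"), ("flags", "H"), ("level", "B")]

def pack_record(values, little_endian=True):
    """Serialize a record dict in the requested byte order."""
    fmt = ("<" if little_endian else ">") + "".join(f for _, f in FIELDS)
    return struct.pack(fmt, *(values[name] for name, _ in FIELDS))

def unpack_record(data, little_endian=True):
    """Deserialize bytes back into a record dict."""
    fmt = ("<" if little_endian else ">") + "".join(f for _, f in FIELDS)
    return dict(zip((name for name, _ in FIELDS), struct.unpack(fmt, data)))

rec = {"id": 258, "flags": 7, "level": 3}
le = pack_record(rec, little_endian=True)
be = pack_record(rec, little_endian=False)
# Endian swapping is just re-packing the same logical record:
assert unpack_record(le, True) == unpack_record(be, False) == rec
```

The point is that once the data structure is specified prescriptively, routine chores like endian swapping follow mechanically from the description.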

Software designers

Classes are used as building blocks by software designers to model more complex structures. In software architecture the Unified Modeling Language (UML) is an industry standard used for modeling the blueprint of software. UML represents structure, associations and interactions between various software elements, like classes, objects or components. It helps the software designer to design, analyze and communicate ideas to other members of the software community.

Blueprint, a configuration management tool, was born out of frustration with development environments, deployment processes, and the complexity of configuration management systems.

Blueprint insists development environments realistically model production and that starts with using Linux. Blueprint only works on Debian- or Red Hat-based Linux systems. We recommend VirtualBox, Vagrant, Rackspace Cloud, or AWS EC2 for development systems that use the same operating system (and version) as is used in production.

On top of the operating system, we recommend using the same web servers, databases, message queue brokers, and other software in development and production. This brings development visibility to entire classes of bugs that only occur due to interactions between production components.

When development and production share the same operating system and software stack, they also share the same interactive management tools, meaning developers and operators alike don’t need to maintain two vocabularies. Well-understood tools like apt-get/dpkg, yum/rpm, and the whole collection of Linux system tools are available everywhere. Blueprint is unique relative to other configuration management in encouraging use of these tools.

What's common to all configuration management tools is the desire to manage the whole stack: from the operating system packages and services through language-specific packages, all the way to your applications. We need to span all of these across all our systems. To pick on RubyGems arbitrarily: RubyGems is purposely ignorant of its relationship to the underlying system, favoring compatibility with Windows and a wide variety of UNIX-like operating systems. Blueprint understands the macro-dependencies between RubyGems itself and the underlying system and is able to predictably reinstall a selection of gems on top of a properly configured operating system.

When constructing this predictable order-of-operations used to reinstall files, packages, services, and source installations, Blueprint, along with other configuration management tools, takes great care in performing idempotent actions. Thus Blueprint prefers to manage the entire contents of a file rather than a diff or a line to append. Idempotency means you can apply a blueprint over and over again with confidence that nothing will change if nothing needs to change.
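The idempotent, whole-file style described above can be sketched in a few lines of Python. The ensure_file helper is hypothetical, not part of Blueprint; it shows why managing the entire contents of a file, rather than a diff or an appended line, makes repeated application safe:

```python
import os
import tempfile

def ensure_file(path, content):
    """Idempotently manage the entire contents of a file.

    Writes only if the current contents differ from the desired contents,
    and reports whether anything changed. Applying it twice with the same
    content is guaranteed to change nothing the second time.
    """
    try:
        with open(path) as f:
            if f.read() == content:
                return False          # already converged; nothing to do
    except FileNotFoundError:
        pass                          # file absent: fall through and create it
    with open(path, "w") as f:
        f.write(content)
    return True

path = os.path.join(tempfile.mkdtemp(), "motd")
assert ensure_file(path, "hello\n") is True    # first apply: changed
assert ensure_file(path, "hello\n") is False   # second apply: no-op
```

Contrast this with appending a line on every run, which would grow the file unboundedly and make the result depend on how many times the tool was applied.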

Because Blueprint can reverse-engineer systems, it is of particular use migrating legacy systems into configuration management. It doesn’t matter when you install Blueprint: changes made to the system even before Blueprint is installed will be taken into account.

It is just a blueprint: a representation of what the software will be like and how it will perform.

FEASIBILITY STUDY – SOFTWARE ENGINEERING


A feasibility study is carried out to select the best system that meets performance requirements. The main aim of the feasibility study activity is to determine whether it would be financially and technically feasible to develop the product. The feasibility study involves analysis of the problem and collection of all relevant information relating to the product, such as the different data items that would be input to the system, the processing required on these data, the output data the system must produce, and the various constraints on the behaviour of the system.

Technical Feasibility

This is concerned with specifying equipment and software that will successfully satisfy the user requirement. The technical needs of the system may vary considerably, but might include:
• The facility to produce outputs in a given time.
• Response time under certain conditions.
• Ability to process a certain volume of transactions at a particular speed.
• Facility to communicate data to distant locations.

In examining technical feasibility, the configuration of the system is given more importance than the actual make of hardware. The configuration should give a complete picture of the system's requirements: how many workstations are required, how these units are interconnected so that they can operate and communicate smoothly, and what speeds of input and output should be achieved at a particular quality of printing.

Economic Feasibility

Economic analysis is the most frequently used technique for evaluating the effectiveness of a proposed system. More commonly known as cost/benefit analysis, the procedure is to determine the benefits and savings that are expected from a proposed system and compare them with the costs. If the benefits outweigh the costs, a decision is taken to design and implement the system; otherwise, further justification or alterations to the proposed system will have to be made if it is to have a chance of being approved. This is an ongoing effort that improves in accuracy at each phase of the system life cycle.
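The cost/benefit comparison at the heart of economic feasibility can be sketched in a few lines; all figures below are invented for illustration:

```python
def is_economically_feasible(costs, benefits):
    """Approve the system only if total expected benefits outweigh total costs.

    A deliberately simple model: real cost/benefit analyses would also
    discount future benefits and refine the estimates at each life-cycle phase.
    """
    return sum(benefits) > sum(costs)

costs    = [50_000, 12_000, 8_000]   # development, hardware, training (hypothetical)
benefits = [30_000, 30_000, 30_000]  # expected annual savings over 3 years (hypothetical)
print(is_economically_feasible(costs, benefits))  # True: 90,000 > 70,000
```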

Operational Feasibility

This is mainly related to human, organizational and political aspects. The points to be considered are:
• What changes will be brought about by the system?
• What organizational structures are disturbed?
• What new skills will be required? Do the existing staff members have these skills? If not, can they be trained in due course of time?

This feasibility study is carried out by a small group of people who are familiar with information system techniques and are skilled in system analysis and design. Proposed projects are beneficial only if they can be turned into information systems that will meet the operating requirements of the organization. This test of feasibility asks whether the system will work when it is developed and installed.

Feasibility

Feasibility is the practical extent to which a project can be performed successfully. To evaluate feasibility, a feasibility study is performed, which determines whether the solution considered to accomplish the requirements is practical and workable in software. Information such as resource availability, cost estimates for software development, benefits of the software to the organization, and the cost to be incurred for its maintenance is considered. The objective of the feasibility study is to establish the reasons for developing software that is acceptable to users, adaptable to change, and conformable to established standards.

Page 6: Software engineering

Types of Feasibility

The types of feasibility that are commonly considered include technical feasibility, operational feasibility and economic feasibility.

Technical Feasibility: Technical feasibility assesses the current resources and technology that are required to accomplish user requirements in the software within the allocated time. For this, the software development team ascertains whether the current resources and technology can be upgraded or added to in order to accomplish the specified user requirements. Technical feasibility performs the following tasks:
> It analyses the technical skills and capabilities of the software development team members.
> It determines whether the relevant technology is stable and established.
> It ascertains that the technology chosen for software development has a large number of users, so that they can be consulted when problems arise or when improvements are required.

Operational Feasibility: Operational feasibility assesses the extent to which the required software performs a series of steps to solve business problems and user requirements. This feasibility is dependent on human resources and involves visualizing whether the software will operate after it is developed and be operated once it is installed.

Economic Feasibility: Economic feasibility determines whether the required software is capable of generating financial gains for an organization. It involves the cost incurred on the software development team, the estimated cost of hardware and software, the cost of performing the feasibility study, and so on. For this, it is essential to consider the expenses made on purchases and the activities required to carry out software development. In addition, it is necessary to consider the benefits that can be achieved by developing the software. Costs to consider include:
> Cost incurred on software development to produce long-term gains for an organization.
> Cost required to conduct a full software investigation.
> Cost of hardware, software, the development team, and training.

Feasibility Study Process

A feasibility study comprises the following steps:
1. Information assessment: identifies information about whether the system helps in achieving the objectives of the organization. It also verifies that the system can be implemented using new technology and within the budget, and whether the system can be integrated with the existing system.
2. Information collection: specifies the sources from which information about the software can be obtained. Generally, these sources include users and the software development team.
3. Report writing: uses a feasibility report, which is the conclusion of the feasibility study by the software development team. It includes a recommendation on whether the software development should continue.



TESTING

Software testing


Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test.[1] Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs (errors or other defects).


Software testing can be stated as the process of validating and verifying that a computer program/application/product:

• meets the requirements that guided its design and development,
• works as expected,
• can be implemented with the same characteristics, and
• satisfies the needs of stakeholders.

Software testing, depending on the testing method employed, can be implemented at any time in the development process. Traditionally most of the test effort occurs after the requirements have been defined and the coding process has been completed, but in the Agile approaches most of the test effort is on-going. As such, the methodology of the test is governed by the chosen software development methodology.

Different software development models will focus the test effort at different points in the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers. In a more traditional model, most of the test execution occurs after the requirements have been defined and the coding process has been completed.

Contents

1 Overview
  1.1 Defects and failures
2 Input combinations and preconditions
3 Economics
  3.1 Roles
4 History
5 Testing methods
  5.1 Static vs. dynamic testing
  5.2 The box approach
    5.2.1 White-box testing
    5.2.2 Black-box testing
    5.2.3 Grey-box testing
  5.3 Visual testing
6 Testing levels
  6.1 Unit testing
  6.2 Integration testing
  6.3 System testing
  6.4 Acceptance testing
7 Testing approach
  7.1 Top-down and bottom-up
8 Objectives of testing
  8.1 Installation testing
  8.2 Compatibility testing
  8.3 Smoke and sanity testing
  8.4 Regression testing
  8.5 Acceptance testing
  8.6 Alpha testing
  8.7 Beta testing
  8.8 Functional vs non-functional testing
  8.9 Destructive testing
  8.10 Software performance testing
  8.11 Usability testing
  8.12 Accessibility
  8.13 Security testing
  8.14 Internationalization and localization
  8.15 Development testing
9 The testing process
  9.1 Traditional CMMI or waterfall development model
  9.2 Agile or Extreme development model
  9.3 A sample testing cycle
10 Automated testing
  10.1 Testing tools
  10.2 Measurement in software testing
11 Testing artifacts
12 Certifications
13 Controversy
14 Related processes
  14.1 Software verification and validation
  14.2 Software quality assurance (SQA)
15 See also
16 References
17 Further reading
18 External links

Overview

Testing can never completely identify all the defects within software.[2] Instead, it furnishes a criticism or comparison that compares the state and behavior of the product against oracles—principles or mechanisms by which someone might recognize a problem. These oracles may include (but are not limited to) specifications, contracts,[3] comparable products, past versions of the same product, inferences about intended or expected purpose, user or customer expectations, relevant standards, applicable laws, or other criteria.

A primary purpose of testing is to detect software failures so that defects may be discovered and corrected. Testing cannot establish that a product functions properly under all conditions but can only establish that it does not function properly under specific conditions.[4] The scope of software testing often includes examination of code as well as execution of that code in various environments and conditions, as well as examining the aspects of code: does it do what it is supposed to do and do what it needs to do. In the current culture of software development, a testing organization may be separate from the development team. There are various roles for testing team members. Information derived from software testing may be used to correct the process by which software is developed.[5]

Every software product has a target audience. For example, the audience for video game software is completely different from banking software. Therefore, when an organization develops or otherwise invests in a software product, it can assess whether the software product will be acceptable to its end users, its target audience, its purchasers, and other stakeholders. Software testing is the process of attempting to make this assessment.

Defects and failures

Not all software defects are caused by coding errors. One common source of expensive defects is requirement gaps, e.g., unrecognized requirements, which result in errors of omission by the program designer.[6] A common source of requirements gaps is non-functional requirements such as testability, scalability, maintainability, usability, performance, and security.

Software faults occur through the following processes. A programmer makes an error (mistake), which results in a defect (fault, bug) in the software source code. If this defect is executed, in certain situations the system will produce wrong results, causing a failure.[7] Not all defects will necessarily result in failures. For example, defects in dead code will never result in failures. A defect can turn into a failure when the environment is changed. Examples of these changes in environment include the software being run on a new computer hardware platform, alterations in source data, or interacting with different software.[7] A single defect may result in a wide range of failure symptoms.

Input combinations and preconditions

A very fundamental problem with software testing is that testing under all combinations of inputs and preconditions (initial state) is not feasible, even with a simple product.[4][8] This means that the number of defects in a software product can be very large and defects that occur infrequently are difficult to find in testing. More significantly, non-functional dimensions of quality (how it is supposed to be versus what it is supposed to do)—usability, scalability, performance, compatibility, reliability—can be highly subjective; something that constitutes sufficient value to one person may be intolerable to another.

Software developers can't test everything, but they can use combinatorial test design to identify the minimum number of tests needed to get the coverage they want. Combinatorial test design enables users to get greater test coverage with fewer tests. Whether they are looking for speed or test depth, they can use combinatorial test design methods to build structured variation into their test cases.[9]
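The idea of combinatorial test design can be sketched concretely. Below is a minimal, unoptimized pairwise generator; the parameters are invented and real tools use far better algorithms, but the goal is the same: cover every pair of parameter values with far fewer tests than the full cartesian product.

```python
from itertools import combinations, product

# Hypothetical test parameters: exhaustive testing needs 3 * 2 * 3 = 18 cases.
PARAMS = {
    "os":      ["linux", "windows", "mac"],
    "browser": ["firefox", "chrome"],
    "locale":  ["en", "de", "ja"],
}

def pairwise_suite(params):
    """Greedily select tests until every pair of parameter values is covered.

    A naive sketch of pairwise (2-way) combinatorial test design: walk the
    full product and keep a candidate only if it covers a not-yet-seen pair.
    """
    names = list(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add(((i, va), (j, vb)))
    suite = []
    for test in product(*params.values()):
        pairs = {((i, test[i]), (j, test[j]))
                 for i, j in combinations(range(len(names)), 2)}
        if pairs & uncovered:        # covers at least one new pair
            suite.append(test)
            uncovered -= pairs
        if not uncovered:
            break
    return suite

suite = pairwise_suite(PARAMS)
print(len(suite), "tests instead of", len(list(product(*PARAMS.values()))))
```

Every pair of values still appears in at least one selected test, which is the structured variation the text describes, at a fraction of the exhaustive cost.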

Economics


A study conducted by NIST in 2002 reports that software bugs cost the U.S. economy $59.5 billion annually. More than a third of this cost could be avoided if better software testing was performed.[10]

It is commonly believed that the earlier a defect is found, the cheaper it is to fix it. The following table shows the cost of fixing the defect depending on the stage it was found.[11] For example, if a problem in the requirements is found only post-release, then it would cost 10–100 times more to fix than if it had already been found by the requirements review. With the advent of modern continuous deployment practices and cloud-based services, the cost of re-deployment and maintenance may lessen over time.

Cost to fix a defect, by the time it was detected:

Time introduced    Requirements  Architecture  Construction  System test  Post-release
Requirements       1×            3×            5–10×         10×          10–100×
Architecture       -             1×            10×           15×          25–100×
Construction       -             -             1×            10×          10–25×

Roles

Software testing can be done by software testers. Until the 1980s, the term "software tester" was used generally, but later it was also seen as a separate profession. Regarding the periods and the different goals in software testing,[12] different roles have been established: manager, test lead, test analyst, test designer, tester, automation developer, and test administrator.

History

The separation of debugging from testing was initially introduced by Glenford J. Myers in 1979.[13] Although his attention was on breakage testing ("a successful test is one that finds a bug"[13][14]), it illustrated the desire of the software engineering community to separate fundamental development activities, such as debugging, from that of verification. Dave Gelperin and William C. Hetzel classified in 1988 the phases and goals in software testing in the following stages:[15]

Until 1956 - Debugging oriented[16]

1957–1978 - Demonstration oriented[17]

1979–1982 - Destruction oriented[18]

1983–1987 - Evaluation oriented[19]

1988–2000 - Prevention oriented[20]

Testing methods

Static vs. dynamic testing

There are many approaches to software testing. Reviews, walkthroughs, or inspections are referred to as static testing, whereas actually executing programmed code with a given set of test cases is referred to as dynamic testing. Static testing can be omitted, and unfortunately in practice often is. Dynamic testing takes place when the program itself is used. Dynamic testing may begin before the program is 100% complete in order to test particular sections of code, applied to discrete functions or modules. Typical techniques for this are either using stubs/drivers or execution from a debugger environment.

The box approach

Software testing methods are traditionally divided into white- and black-box testing. These two approaches are used to describe the point of view that a test engineer takes when designing test cases.

White-Box testing

Main article: White-box testing

White-box testing (also known as clear box testing, glass box testing, transparent box testing, and structural testing) tests internal structures or workings of a program, as opposed to the functionality exposed to the end-user. In white-box testing an internal perspective of the system, as well as programming skills, are used to design test cases. The tester chooses inputs to exercise paths through the code and determine the appropriate outputs. This is analogous to testing nodes in a circuit, e.g. in-circuit testing (ICT).

While white-box testing can be applied at the unit, integration and system levels of the software testing process, it is usually done at the unit level. It can test paths within a unit, paths between units during integration, and between subsystems during a system–level test. Though this method of test design can uncover many errors or problems, it might not detect unimplemented parts of the specification or missing requirements.

Techniques used in white-box testing include:

API testing (application programming interface) - testing of the application using public and private APIs

Code coverage - creating tests to satisfy some criteria of code coverage (e.g., the test designer can create tests to cause all statements in the program to be executed at least once)

Fault injection methods - intentionally introducing faults to gauge the efficacy of testing strategies

Mutation testing methods

Static testing methods

Code coverage tools can evaluate the completeness of a test suite that was created with any method, including black-box testing. This allows the software team to examine parts of a system that are rarely tested and ensures that the most important function points have been tested.[21] Code coverage as a software metric can be reported as a percentage for:

Function coverage, which reports on functions executed

Statement coverage, which reports on the number of lines executed to complete the test

100% statement coverage ensures that every statement in the program has been executed at least once, but it does not guarantee that every branch or path (in terms of control flow) has been exercised. It is helpful in checking basic functionality, but not sufficient, since the same code may process different inputs correctly or incorrectly.
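A minimal example of why statement coverage is not sufficient: a single test can execute every statement of a function and still miss an input that makes those very same statements fail.

```python
def mean(xs):
    """Average of a list -- one statement, easily '100% covered' by one test."""
    return sum(xs) / len(xs)

# This single test executes every statement, so statement coverage is 100%:
assert mean([2, 4, 6]) == 4.0

# Yet the same, fully covered statement fails for another input:
try:
    mean([])                      # ZeroDivisionError: len([]) == 0
    raise AssertionError("expected a failure")
except ZeroDivisionError:
    pass                          # defect found only by varying the input
```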

Black-box testing

Main article: Black-box testing


Black-box testing treats the software as a "black box", examining functionality without any knowledge of internal implementation. The tester is only aware of what the software is supposed to do, not how it does it.[22] Black-box testing methods include: equivalence partitioning, boundary value analysis, all-pairs testing, state transition tables, decision table testing, fuzz testing, model-based testing, use case testing, exploratory testing and specification-based testing.

Specification-based testing aims to test the functionality of software according to the applicable requirements.[23] This level of testing usually requires thorough test cases to be provided to the tester, who then can simply verify that for a given input, the output value (or behavior), either "is" or "is not" the same as the expected value specified in the test case. Test cases are built around specifications and requirements, i.e., what the application is supposed to do. It uses external descriptions of the software, including specifications, requirements, and designs to derive test cases. These tests can be functional or non-functional, though usually functional.
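A small sketch applying two of the black-box methods named above, equivalence partitioning and boundary value analysis, to a hypothetical specification: "accept ages from 18 to 65 inclusive". The tester needs only the specification, not the implementation.

```python
def is_eligible(age):
    """System under test (hypothetical): accept ages from 18 to 65 inclusive."""
    return 18 <= age <= 65

# Equivalence partitioning: one representative input per partition.
assert is_eligible(40) is True     # valid partition
assert is_eligible(10) is False    # below-range partition
assert is_eligible(70) is False    # above-range partition

# Boundary value analysis: values on and adjacent to each boundary,
# where off-by-one defects tend to cluster.
assert [is_eligible(a) for a in (17, 18, 65, 66)] == [False, True, True, False]
```

Each expected value comes straight from the specification, so the tester can verify that, for a given input, the output either "is" or "is not" the specified value.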

Specification-based testing may be necessary to assure correct functionality, but it is insufficient to guard against complex or high-risk situations.[24]

One advantage of the black box technique is that no programming knowledge is required. Whatever biases the programmers may have had, the tester likely has a different set and may emphasize different areas of functionality. On the other hand, black-box testing has been said to be "like a walk in a dark labyrinth without a flashlight."[25] Because they do not examine the source code, there are situations when a tester writes many test cases to check something that could have been tested by only one test case, or leaves some parts of the program untested.

This method of test can be applied to all levels of software testing: unit, integration, system and acceptance. It typically comprises most if not all testing at higher levels, but can also dominate unit testing as well.


What's Tangible Software Engineering Education?

Taichi NakamuraDirector, Tangible Software Engineering Education and Research Project

Tokyo University of Technology, Tokyo, [email protected]

1. Introduction

Computer systems have infiltrated many fields such as finance, distribution, manufacturing, education and electronic government and must be safe and secure. On the other hand, the demands of customers are subject to bewilderingly change, corresponding to the rapid progress of information technologies and changes in the business environment. Enterprises had to cultivate human resources with highly competent skills in information technology. However, enterprises have recently been urged to be selective and to focus their investment in a global competitive setting. Therefore, IT industry has been asking universities to promote the development of advanced IT talent in their students [1].A decrease in the wish to study IT by the young people who could become the human resources with the required talent has become a big problem. Universities need to work immediately on the establishment of a method of practical software education which is improved to the level that can be used by businesses and on the development of the associated teaching material.The tangible software education research project won the support of the private university science research upgrade promotion business of the Ministry of Education, Culture, Sports, Science and Technology in fiscal year 2007, and began work at the Open Research Center which was set up by Tokyo University of Technology to deal with such a situation. The purpose of the project is to develop the teaching material and an education method that will promote the development of professional talent which has a high degree of professionalism in the rapidly changing field of information technologies. This paper describes the principles of software engineering education led by the


educational philosophy of Tokyo University of Technology, the issues of software engineering education, and the purpose of the research.

2. The Idea of Software Engineering Education

2.1 Principles of the SE course in light of the University's principles

Tokyo University of Technology has three stated principles: (1) theoretical and technical education for professions contributing to society; (2) education through cutting-edge R&D and the social application of research results; and (3) the development of an ideal environment for education and research. To realize the first of these, the project established the following principle for software engineering education, as shown in Figure 1: "Develop human resources with design and development abilities that enable them to build software after analyzing and modeling the customers' needs, and with the adaptability and management ability to handle their role as a team member under various constraints." To apply this principle, the project is designing and developing software engineering education curricula based on Instructional Design (ID). One aim of the project is to realize Profile Based Education (PBE), a method of providing educational materials and an instructional approach matched to each student's learning curve.

Figure 1. Principle of the Software engineering course in light of the university's principles

2.2 Design of the curriculum for software engineering based on Instructional Design

The course curriculum should be designed systematically according to the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model. The most


important aspect of the process of designing the curriculum is that students define the objectives to be attained, together with a quantitative index measuring the acquired level of human-oriented knowledge about communication and team building, derived from criteria judging whether that knowledge is effective in a business environment.

2.3 The system of practical education

It is more important for practitioners to acquire the design methods and management methods required in each phase of the entire system development process than to explore the underlying techniques of statistics and psychology. Practical education that systematically imparts knowledge of software engineering involves repeatedly studying the design methods and the management methods which a software engineer should use in every development process. Figure 2 shows a structure of PBL that employs practical education cycles in each development process. The universities provide a scenario-based curriculum that requires students to design or manage the development process in virtual projects; such virtual projects have the advantage of providing many more simulated experiences than OJT. The curriculum has been designed with an emphasis on a process in which a learner solves exercises produced for each phase of the development process of the virtual project. The resulting process is a fusion of a PBE-oriented instructional design cycle and a practical education cycle.


Figure 2. A structure of PBL that employs practical education cycles in each development process

3. Quaternary tangible software education

The tangible software engineering education and research project should address the following issues. New students arrive at the departments of information engineering at universities every year with a variety of ambitions: they may be keen to study programming in order to make game software, or they may secretly aspire to an active future career as a systems engineer. Surprisingly, within six months of entering the university their dreams are smashed to smithereens and they lose their motivation to study software engineering. It is also true that quite a lot of students, living in a modern mature society where they can easily get everything they want, have never formed any hope or dream in the first place. Meanwhile, information systems have been growing in both scale and complexity. Highly competent people, who have received advanced education in IT and are strongly motivated by their responsibility as members of a system development project, are needed to support these information systems. Whatever their current situation, we have to encourage such


young people to become involved in the information infrastructure supporting our society. One of the most important conditions for providing software engineering education that can solve this kind of problem is to motivate students to study IT and then to keep that motivation alive. The four tangible issues which education staff have to address in order to satisfy students' needs in software engineering education are as follows:

1) A tangible curriculum. In order for students to know the purpose of the course and to be able to plan their careers, course curricula should be designed by referring to the information technology skill standards (ITSS) defined by the Ministry of Economy, Trade and Industry, and should enable students to acquire skills related to their future occupation.

2) Tangible lectures. In order for students to understand the lectures and be motivated to study IT, experience-based training should be provided in the course. Students can share an awareness of issues relating to information systems, and can subsequently learn how to understand abstract concepts and logical thinking in order to gain a deep understanding and to develop skills.

3) A tangible relationship with the IT industry. The significant difference between skills learned in a classroom and skills required to develop an information system in a practical situation has been pointed out. We develop competent human resources with the following practical abilities: analyzing a customer's requirements; building a system model; designing and implementing a software system which meets the customer's requirements; an aptitude for working in a team under various real-life constraints; and managing money, time, and people.

4) A tangible profile of each student. The behavioral track records of each learner are gathered during class and analyzed. The relation between the track records and the skill level each learner has acquired should be formulated, so that teaching staff can quantify the acquired skill level by using a formula derived from that relation. To realize PBE, the teaching staff have to fit the teaching materials to the students and establish appropriate teaching methods according to the learning curves drawn by the formula. In order to raise a student's motivation, maintain it, and indeed cause it to increase, the teaching staff present students with the whole picture of an information system, get students interested in it, provide a comprehensible explanation, and then let students gain confidence in developing the system. As a result, the students feel a sense of satisfaction.

4. Tangible software engineering education and research

4.1. The purpose of Tangible software engineering education and research


The tangible software engineering education and research project aims to realize software engineering education for a new era with the following two features: 1) practical software engineering education that takes students through the entire life cycle, from requirements definition through design, implementation and testing to operation and maintenance, and bridges the gap between classroom and field practice; 2) software engineering education that cultivates skills and infuses students with the joy of achievement through the experience of actually writing and running programs. We are developing a curriculum system for practical software engineering by building on the actual achievements of tangible software engineering education, and providing a project-based learning (PBL) environment and teaching materials [2][3].

4.2. Topics of research

The tangible software engineering education and research project develops an educational method for improving human-management-related skills, such as communication and leadership, which are necessary for a learner to contribute to team work. In particular, we monitor the behavior of each learner during a lecture, analyze the monitored behavioral track records, and provide the appropriate instructional approach and teaching materials to each learner [3]. Role-play training is one of the most effective forms of PBL and an appropriate learning method for education in project management, which requires collaborative working with multiple stakeholders. The scenario is the most important factor in role-play training, because the progress of a role-play is affected by the behavior of the learners, who each play a role and drive the role-play according to the scenario [4]. A web application system similar to a role-playing game has been built so that learners can carry out the role-play training anywhere and at any time [2].

4.2.1. Profile based software engineering education

(1) Personal profile based software engineering education. We are developing a teaching method for the new era with the following features: taking account of the unique learning curve of each learner; looking beyond the scope of the existing curriculum; and flexibly varying the context of the teaching materials to adapt them to the learner's skill level. To achieve personal PBE, profile data must be created for each learner, including the learner's behavioral track record and a record of the times taken to acquire different skill levels. In addition, the features of each learner's character are added to the created profile data [2].

(2) Team profile based software engineering education. For a learner to experience team work, it is desirable that a team should include at least one advanced professional. We extract the behavioral track records of the team interacting with the agent for the team that achieved the best performance in the


role-play exercise. The collaborative model for project management is created using the best behavioral records and the individual profile data of the team members. This model can also present a guideline for actions for a team to achieve the best performance.

4.2.2. Role-play training environment

(1) Method for developing a role-play scenario. The role-play scenario describes the problems which a learner should solve by using the management skills taught in class, the causes of those problems, and the changes in the circumstances of the virtual project which have given rise to them [4]. Points to remember when developing a role-play scenario are as follows:

1) the service being developed in the virtual project should be familiar to the learners;
2) the level of difficulty of an exercise should be matched to the student's skill level;
3) contradictions must be eliminated;
4) the instructions that learners refer to when undertaking the role-play should be written clearly;
5) the actions learners execute in the role-play should be useful in providing a quantitative indication of the skills learners have achieved;
6) an integrated development environment should be built to improve the productivity of developing role-play scenarios, which are written in HTML with XML tags [2][3].

(2) Role-play training with an agent. A software agent system, which implements the advanced professional skills of project management and plays the role of a mentor in a role-play, may be useful in improving learning effectiveness [5]. We are developing an agent system which includes not only an agent with skills gained by experience but also various characters to provide training in human-related skills.

(3) The role-play training system. We have developed a web-based application system providing a role-play training facility [2]. The system consists of a web server, an application server that implements the role-play execution core required to run a virtual project and provides user management, and a database server, as shown in Figure 3. The role-play execution core implements user administration, PROMASTER administration, the lobby for selecting a scenario, the role-play engine, and the feedback engine.


Figure 3. Outline of the role-play training system, PROMASTER

4.3. Curriculum and teaching materials for tangible software engineering education

We are developing a four-year unified curriculum with teaching materials to realize tangible software engineering education for use not only in universities but also in high schools and industry.

4.4. Relationship between FDmO in the University and high schools and the IT industry

In the tangible software engineering education approach, competent engineers working in the field are asked to provide coaching during practical exercises in order for


students to acquire a correct understanding of work in the IT industry at an early stage in their college life and ultimately to be able to find a satisfying job. In addition, the teaching materials we are developing in the project provide teaching staff in high schools with information about up-to-date trends in the software industry, in order to encourage more high school students to take an interest in IT and to be willing to work in the IT industry. Consequently, the tangible software engineering education and research project will provide industry with advanced human resources by considering all stages from high school through university. The Software Engineering Education and Research Center (SEED) has been organized to formulate and maintain the software engineering education system, which closely connects universities to high schools, the IT industry, and research institutes. The project has also founded the Faculty Development management Office (FDmO) to liaise with other universities, as shown in Figure 4.

Figure 4. Relationship between the FDmO in the University and high school and IT industry


5. Conclusion

This paper has introduced the principles of tangible software engineering education and research. The activities of the project are significant, and are as follows: to supply highly competent software engineers who have practical abilities and can design and implement advanced and complicated information systems; and to contribute to the healthy development of the IT industry, not only in Japan but around the world, by means of a good relationship between universities and the IT industry. Young people often have little interest in developing advanced skills because, in the mature society of Japan, they can already get everything they want. Few students want a job in the IT field, because they have no way to form a vision of work in this field after graduation. The tangible software engineering education and research project aims to achieve a dramatic improvement in this problematic situation, and its activities and the results obtained are extremely significant.

6. Acknowledgements

This research is supported by "Tangible Software Engineering (SE) Education and Research" as part of the "Program promoting the leveling of private university academic research," for which the Ministry of Education, Culture, Sports, Science and Technology invited public participation in the academic year 2007.

Tangibility
From Wikipedia, the free encyclopedia




Tangibility is the attribute of being easily detectable with the senses.

In criminal law, one of the elements of an offense of larceny is that the stolen property must be tangible.

In the context of intellectual property, expression in tangible form is one of the requirements for copyright protection.

Tangible property
From Wikipedia, the free encyclopedia

Tangible property in law is, literally, anything which can be touched, and includes both real property and personal property (or moveable property), and stands in distinction to intangible property.

In English law and some Commonwealth legal systems, items of tangible property are referred to as choses in possession (or a chose in possession in the singular). However, some property, despite being physical in nature, is classified in many legal systems as intangible property rather than tangible property because the rights associated with the physical item are of far greater significance than the physical properties. Principally, these are documentary intangibles. For example, a promissory note is a piece of paper that can be touched, but the real significance is not the physical paper, but the legal rights which the paper confers, and hence the promissory note is defined by the legal debt rather than the physical attributes.[1]

A unique category of property is money, which in some legal systems is treated as tangible property and in others as intangible property. Whilst most countries' legal tender is expressed in the form of intangible property ("The Treasury of Country X hereby promises to pay to the bearer on demand...."), in practice bank notes are now rarely redeemed in any country, which has led to bank notes and coins being classified as tangible property in most modern legal systems.

References


1. Hon. Giles, J. (May 1, 2008). "R&L ZOOK, INC., d/b/a, t/a, aka UNITED CHECK CASHING COMPANY, Plaintiff, v. PACIFIC INDEMNITY COMPANY, Defendant" (PDF). paed.uscourts.gov. Philadelphia, PA: United States District Court, Eastern District of Pennsylvania. p. 6. Archived from the original on 2008-10-05. Retrieved 2011-07-11.

Tangible user interface
From Wikipedia, the free encyclopedia


A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used.

One of the pioneers in tangible user interfaces is Hiroshi Ishii, a professor in the MIT Media Laboratory who heads the Tangible Media Group. His particular vision for tangible UIs, called Tangible Bits, is to give physical form to digital information, making bits directly manipulable and perceptible. Tangible bits pursues seamless coupling between these two very different worlds of bits and atoms.

Contents

1 Characteristics of tangible user interfaces
2 Examples
3 State of the art
4 See also
5 References
6 External links

Characteristics of tangible user interfaces

1. Physical representations are computationally coupled to underlying digital information.
2. Physical representations embody mechanisms for interactive control.
3. Physical representations are perceptually coupled to actively mediated digital representations.
4. The physical state of tangibles embodies key aspects of the digital state of a system.

According to [1], the five basic defining properties of tangible user interfaces are as follows:

1. space-multiplex both input and output;
2. concurrent access and manipulation of interface components;
3. strong specific devices;
4. spatially aware computational devices;
5. spatial reconfigurability of devices.

Examples

An example of a tangible UI is the Marble Answering Machine by Durrell Bishop (1992). A marble represents a single message left on the answering machine. Dropping a marble into a dish plays back the associated message or calls back the caller.

Another example is the Topobo system. The blocks in Topobo are like LEGO blocks which can be snapped together, but can also move by themselves using motorized components. A person can push, pull, and twist these blocks, and the blocks can memorize these movements and replay them.

Another implementation allows the user to sketch a picture on the system's table top with a real, tangible pen. Using hand gestures, the user can clone the image and stretch it along the X and Y axes, just as in a paint program. Such a system integrates a video camera with a gesture recognition system.

Another example is jive. The implementation of a TUI helped make this product more accessible to elderly users of the product. The 'friend' passes can also be used to activate different interactions with the product.

Several approaches have been made to establish a generic middleware for TUIs. They aim at independence from application domains as well as flexibility in terms of the deployed sensor technology. For example, Siftables provides an application platform in which small gesture-sensitive displays act together to form a human-computer interface.

To support collaboration, TUIs must allow spatial distribution, asynchronous activities, and dynamic modification of the TUI infrastructure, to name the most prominent requirements. One approach presents a framework based on the LINDA tuple space concept to meet these requirements; the implemented TUIpist framework deploys arbitrary sensor technology for any type of application, and actuators in distributed environments.

A further example of a type of TUI is a Projection Augmented model.


State of the art

Since the invention of Durrell Bishop's Marble Answering Machine (1992)[2] two decades ago, interest in tangible user interfaces (TUIs) has grown constantly, and every year more tangible systems appear. In 1999 Gary Zalewski patented a system of moveable children's blocks containing sensors and displays for teaching spelling and sentence composition.[3] A similar system is being marketed as "Siftables".

The MIT Tangible Media Group, headed by Hiroshi Ishii, is continuously developing and experimenting with TUIs, including many tabletop applications.

Urp[4] and the more advanced Augmented Urban Planning Workbench[5] allow digital simulation of air flow, shadows, reflections, and other data based on the positions and orientations of physical models of buildings on the table surface.

Newer developments go one step further and incorporate the third dimension by allowing users to form landscapes with clay (Illuminating Clay[6]) or sand (SandScape[7]). Again, different simulations allow the analysis of shadows, height maps, slopes and other characteristics of the interactively formable landmasses.

InfrActables[8] is a back-projection collaborative table that allows interaction through TUIs that incorporate state recognition. Adding different buttons to the TUIs enables additional functions associated with them. Newer versions of the technology can even be integrated into LC displays[9] by using infrared sensors behind the LC matrix.

The Tangible Disaster[10] allows the user to analyze disaster countermeasures and simulate different kinds of disasters (fire, flood, tsunami, etc.) and evacuation scenarios during collaborative planning sessions. Physical objects ("pucks") allow positioning disasters by placing them on the interactive map, and additionally allow tuning parameters (i.e. scale) using dials attached to them.

The commercial potential of TUIs has recently been identified. The repeatedly awarded Reactable,[11] an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of the Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters, modulators...) and parametrising them by rotating them and using touch input.

Microsoft has been distributing its Windows-based Microsoft Surface[12] platform since 2008. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial settings, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing the bottle on the table and navigating through menus via touch input. Interactions such as the collaborative browsing of photographs from a camcorder or cell phone that connects seamlessly once placed on the table are also supported.


Another notable interactive installation is instant city,[13] which combines gaming, music, architecture and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments by different composers.

The development of the Reactable and the subsequent release of its tracking technology reacTIVision[14] under the GNU/GPL as well as the open specifications of the TUIO protocol have triggered an enormous amount of developments based on this technology.

In the last few years, many amateur and semi-professional projects outside academia and commerce have also been started. Thanks to open-source tracking technologies (reacTIVision[14]) and the ever-increasing computational power available to end consumers, the required infrastructure is nowadays accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow anyone to set up tangible systems with minimal programming and material effort. This opens doors to novel ways of perceiving human-computer interaction and gives the broad public room for new forms of creativity to experiment and play with.

It is difficult to keep track of the rapidly growing number of these systems and tools. While many of them seem only to utilize the available technologies and are limited to initial experiments and tests of basic ideas, or merely reproduce existing systems, a few of them develop into novel interfaces and interactions and are deployed in public spaces or embedded in art installations.[15][16]

The Tangible Factory Planning[17] is a tangible table based on reacTIVision that allows users to collaboratively plan and visualize production processes in combination with plans of new factory buildings; it was developed within a diploma thesis.

Another of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table,[18] which was exhibited at the Bauhaus-University in Weimar to mark the 90th anniversary of the establishment of the Bauhaus. Visitors could browse and explore the biographies, complex relations and social networks between members of the movement.

Definition of 'Raw Materials'

A material or substance used in the primary production or manufacturing of a good. Raw materials are often natural resources such as oil, iron and wood. Before being used in the manufacturing process, raw materials are often altered for use in different processes. Raw materials are often referred to as commodities, which are bought and sold on commodities exchanges around the world.

Investopedia explains 'Raw Materials'

Raw materials are sold in what is called the factor market. This is because raw materials are factors of production along with labor and capital. Raw materials are so important to the production process that the success of a country's economy can be determined by the amount of natural resources the country has within its own borders. A country that has abundant natural resources does not need to import as many raw materials, and has an opportunity to export the materials to other countries.

Definition

Basic substance in its natural, modified, or semi-processed state, used as an input to a production process for subsequent modification or transformation into a finished good.

Read more: http://www.businessdictionary.com/definition/raw-material.html

Software portability
From Wikipedia, the free encyclopedia


This article is about portability in itself. For the work required to make software portable, see Porting.

Portability in high-level computer programming is the usability of the same software in different environments. The prerequisite for portability is a generalized abstraction between the application logic and system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for reducing development cost.


Contents

1 Strategies for portability
 o 1.1 Similar systems
 o 1.2 Different operating systems, similar processors
 o 1.3 Different processors
2 Recompilation
3 See also
4 Sources

Strategies for portability

Software portability may involve:

Transferring installed program files to another computer of basically the same architecture.

Reinstalling a program from distribution files on another computer of basically the same architecture.

Building executable programs for different platforms from source code; this is what is usually understood by "porting".

Similar systems

When operating systems of the same family are installed on two computers whose processors have similar instruction sets, it is often possible to transfer the files implementing a program between them.

In the simplest case, the file or files may simply be copied from one machine to the other. However, in many cases the software is installed on a computer in a way which depends upon its detailed hardware, software, and setup: with device drivers for particular devices, using the installed operating system and supporting software components, and using particular drives or directories.

In some cases, software usually described as "portable software" is specifically designed to run on different computers with compatible operating systems and processors, without any machine-dependent installation; it is sufficient to transfer the specified directories and their contents. Software installed on portable mass storage devices such as USB sticks can be used on any compatible computer simply by plugging the storage device in, and stores all configuration information on the removable device. Otherwise, hardware- and software-specific information is often stored in configuration files in specified locations (e.g. the registry on machines running Microsoft Windows).
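The distinction between a "portable" application that keeps its settings next to itself and an installed one that uses an OS-specific location can be sketched as follows. This is an illustrative sketch only, not taken from the article; the file and directory names (`settings.json`, `myapp`) are invented for the example.

```python
import json
import sys
from pathlib import Path

def config_path() -> Path:
    """Return the path of the application's configuration file.

    A "portable" application keeps its settings next to its own
    program file (e.g. on the USB stick it runs from), so no trace
    is left on the host machine and no installation step is needed.
    """
    app_dir = Path(sys.argv[0]).resolve().parent
    portable_cfg = app_dir / "settings.json"
    if portable_cfg.exists():
        # Portable mode: configuration travels with the program.
        return portable_cfg
    # Installed mode: fall back to a per-user, OS-specific location
    # (the kind of machine-specific state the article describes).
    if sys.platform == "win32":
        base = Path.home() / "AppData" / "Roaming"
    else:
        base = Path.home() / ".config"
    return base / "myapp" / "settings.json"

def load_config() -> dict:
    """Load the configuration, or return defaults if none exists."""
    path = config_path()
    if path.exists():
        return json.loads(path.read_text())
    return {}

print(config_path().name)  # -> settings.json
```

Because all state lives under the application's own directory in portable mode, copying that directory to another compatible machine is the entire "installation".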

Software which is not portable in this sense will have to be transferred with modifications to support the environment on the destination machine.

Different operating systems, similar processors


When the systems in question have compatible processors (usually x86-compatible processors on desktop computers), they will execute the low-level program instructions in the same manner, but the system calls are likely to differ between operating systems. Later operating systems of UNIX heritage, including Linux, BSD, Solaris and OS X, achieve a high degree of software portability by using the POSIX standard for calling OS functions. Such POSIX-based programs can be compiled for use on Windows by means of interface software such as Cygwin.
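The role of a standard interface such as POSIX can be illustrated with a small sketch (an illustration by analogy, not from the article). Python's `os.statvfs` is a thin wrapper over the POSIX `statvfs()` call, so the same code runs unchanged on Linux, BSD, Solaris and OS X; on Windows the same question must be asked through a different system call, which is exactly the difference a standard interface layer hides.

```python
import os
import shutil

def disk_usage_report(path="."):
    """Free and total space via the POSIX statvfs() call.

    os.statvfs() exists only on POSIX-conforming systems; this is
    the kind of OS-function call the POSIX standard makes portable
    across UNIX-heritage operating systems.
    """
    st = os.statvfs(path)
    free = st.f_bavail * st.f_frsize   # blocks available to non-root users
    total = st.f_blocks * st.f_frsize  # total blocks in the filesystem
    return free, total

# A higher-level library can hide the difference entirely:
# shutil.disk_usage() works on both POSIX systems and Windows.
free, total = disk_usage_report("/")
portable = shutil.disk_usage("/")
print(free <= total and portable.free <= portable.total)  # -> True
```

The same pattern, coding against the standard interface rather than against one operating system's private API, is what lets POSIX-based programs be rebuilt on another UNIX-like system without modification.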

Different processors

As of 2011, the majority of desktop and laptop computers used microprocessors compatible with the 32- and 64-bit x86 instruction sets. Smaller portable devices use processors with different and incompatible instruction sets, such as ARM. Larger and smaller devices differ so much that detailed software operation differs as well; an application designed to display suitably on a large screen cannot simply be ported to a pocket-sized smartphone with a tiny screen, even if the functionality is similar.
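When a program does need to behave differently on different processor families, it can at least detect the instruction set at run time. Python's `platform` module is one portable way to do so; the strings checked below are common examples, not an exhaustive list:

```python
import platform

machine = platform.machine()  # e.g. 'x86_64', 'AMD64', 'arm64', 'aarch64'

if machine.lower() in ("x86_64", "amd64", "i386", "i686"):
    family = "x86"
elif machine.lower().startswith(("arm", "aarch")):
    family = "ARM"
else:
    family = "other"

print(machine, "->", family)
```

Run-time detection only solves the dispatch problem; any machine code shipped with the program must still be compiled separately for each instruction set.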

Web applications are required to be processor-independent, so portability can be achieved by using web programming techniques, writing in JavaScript; such a program runs in any common web browser. For security reasons, web applications must have limited control over the host computer, especially regarding reading and writing files. Non-web programs, installed on a computer in the normal manner, can have more control and yet achieve system portability by linking to the Java package, which as of 2011 most computers could be assumed to have, containing the Java virtual machine and its Java Class Library. By using Java bytecode instructions instead of processor-dependent machine code, maximum software portability is achieved. Programs need not be written in Java itself, as compilers for several other languages can generate Java bytecode: JRuby does so from Ruby programs, Jython from Python programs, and there are several others.
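CPython makes the same trade-off as Java: source code is compiled to processor-independent bytecode, which a virtual machine interprets on each platform. A minimal sketch (the `add` function is purely illustrative):

```python
import dis

def add(a, b):
    return a + b

# For a given CPython version, the compiled bytecode is identical on
# x86 and ARM machines; only the interpreter underneath is
# processor-specific.
code = add.__code__.co_code  # raw, processor-independent bytecode
print(len(code) > 0)

dis.dis(add)  # human-readable disassembly of that bytecode
```

The portability boundary is the version of the virtual machine, not the processor: the same `.pyc` or `.class` file runs wherever a compatible VM is installed.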

Recompilation

Software can be recompiled and linked from source code for different operating systems and processors if written in a programming language supporting compilation for the platforms. This is usually a task for the program developers; typical users have neither access to the source code nor the required skills.

In open-source environments such as Linux the source code is available to all. In earlier days, source code was often distributed in a standardised format and could be built into executable code with a standard Make tool for any particular system by moderately knowledgeable users, provided no errors occurred during the build. Some Linux distributions distribute software to users in source form. In these cases there is usually no need for detailed adaptation of the software by the user; it is distributed in a way which adjusts the compilation process to match the system.

See also


Hardware-dependent Software
Java (software platform)
Portability testing
Platform
Productisation

software portability: Being able to move software from one machine platform to another. It refers to system software or application software that can be recompiled for a different platform, or to software that is available for two or more different platforms. See portable application. Contrast with data portability.

Software engineering

A software engineer programming for the Wikimedia Foundation

Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software.[1][2][3] In layman's terms, it is the act of using insights to conceive, model and scale a solution to a problem. The term software engineering first appeared in the 1968 NATO Software Engineering Conference and was meant to provoke thought regarding the perceived "software crisis" of the time.[4][5][6] Software development, a much used and more generic term, does not necessarily subsume the engineering paradigm. The generally accepted concepts of software engineering as an engineering discipline are specified in the Guide to the Software Engineering Body of Knowledge (SWEBOK), which has become the internationally accepted standard ISO/IEC TR 19759:2005.[7]

For those who wish to become recognized as professional software engineers, the IEEE offers two certifications (Certified Software Development Associate and Certified Software Development Professional). The IEEE certifications do not use the term Engineer in their titles for compatibility reasons: in some parts of the US, such as Texas, use of the term Engineer is restricted to those who hold a Professional Engineer license. Furthermore, starting in 2013, NCEES Professional Engineer licenses in the United States are also available for software engineers.[8]

Contents

1 History
2 Profession
 o 2.1 Employment
 o 2.2 Certification
 o 2.3 Impact of globalization
3 Education
4 Comparison with other disciplines
5 Software Process
 o 5.1 Models
  5.1.1 Waterfall model
6 Subdisciplines
7 Related disciplines
 o 7.1 Systems engineering
 o 7.2 Computer software engineers
8 See also
9 Notes
10 References
11 Further reading
12 External links

History

Main article: History of software engineering


When the first modern digital computers appeared in the early 1940s,[9] the instructions to make them operate were wired into the machine. Practitioners quickly realized that this design was not flexible and came up with the "stored program architecture" or von Neumann architecture. Thus the division between "hardware" and "software" began with abstraction being used to deal with the complexity of computing.

Programming languages started to appear in the 1950s, another major step in abstraction. Major languages such as Fortran, ALGOL, and COBOL were released in the late 1950s to deal with scientific, algorithmic, and business problems respectively. E.W. Dijkstra wrote his seminal paper, "Go To Statement Considered Harmful",[10] in 1968, and David Parnas introduced the key concepts of modularity and information hiding in 1972[11] to help programmers deal with the ever-increasing complexity of software systems. A software system for managing the hardware, called an operating system, was also introduced, most notably Unix in 1969. In 1967, the Simula language introduced the object-oriented programming paradigm.

These advances in software were met with more advances in computer hardware. In the mid 1970s, the microcomputer was introduced, making it economical for hobbyists to obtain a computer and write software for it. This in turn led to the now famous Personal Computer (PC) and Microsoft Windows. The Software Development Life Cycle or SDLC was also starting to appear as a consensus for centralized construction of software in the mid 1980s. The late 1970s and early 1980s saw the introduction of several new Simula-inspired object-oriented programming languages, including Smalltalk, Objective-C, and C++.

Open-source software started to appear in the early 1990s in the form of Linux and other software, introducing the "bazaar" or decentralized style of constructing software.[12] Then the World Wide Web and the popularization of the Internet hit in the mid-1990s, changing the engineering of software once again. Distributed systems gained sway as a way to design systems, and the Java programming language was introduced with its own virtual machine as another step in abstraction. Programmers collaborated and wrote the Agile Manifesto in 2001, which favored more lightweight processes to create cheaper and more timely software.

The current definition of software engineering is still being debated by practitioners today as they struggle to come up with ways to produce software that is "cheaper, better, faster". Cost reduction has been a primary focus of the IT industry since the 1990s. Total cost of ownership represents the costs of more than just acquisition. It includes things like productivity impediments, upkeep efforts, and resources needed to support infrastructure.

Profession

Main article: Software engineer

Legal requirements for the licensing or certification of professional software engineers vary around the world. In the UK, the British Computer Society licenses software engineers and members of the society can also become Chartered Engineers (CEng), while in some areas of Canada, such as Alberta, Ontario,[13] and Quebec, software engineers can hold the Professional Engineer (P.Eng) designation and/or the Information Systems Professional (I.S.P.) designation.


In Canada, there is a legal requirement to hold a P.Eng licence when one wants to use the title "engineer" or practice "software engineering". In the USA, beginning in 2013, a path to licensure for software engineers becomes a reality. As with the other engineering disciplines, the requirements consist of earning an ABET-accredited bachelor's degree in software engineering (or any non-ABET degree plus an NCEES credentials evaluation), passing the Fundamentals of Engineering Exam, having at least four years of demonstrably relevant experience, and passing the Software Engineering PE Exam. In some states, such as Florida, Texas, Washington, and others, software developers cannot use the title "engineer" unless they are licensed professional engineers who have passed the PE Exam and hold a valid licence to practice. The licence must be renewed periodically, a requirement known as continuing education, to ensure engineers stay up to date with the latest techniques and safest practices.[14][15]

The IEEE Computer Society and the ACM, the two main US-based professional organizations of software engineering, publish guides to the profession of software engineering. The IEEE's Guide to the Software Engineering Body of Knowledge - 2004 Version, or SWEBOK, defines the field and describes the knowledge the IEEE expects a practicing software engineer to have. Currently, the SWEBOK v3 is being produced and will likely be released in mid-2013.[16] The IEEE also promulgates a "Software Engineering Code of Ethics".[17]

Employment

In 2004, the U. S. Bureau of Labor Statistics counted 760,840 software engineers holding jobs in the U.S.; in the same time period there were some 1.4 million practitioners employed in the U.S. in all other engineering disciplines combined.[18] Due to its relative newness as a field of study, formal education in software engineering is often taught as part of a computer science curriculum, and many software engineers hold computer science degrees.[19]

Many software engineers work as employees or contractors. Software engineers work with businesses, government agencies (civilian or military), and non-profit organizations. Some software engineers work for themselves as freelancers. Some organizations have specialists to perform each of the tasks in the software development process. Other organizations require software engineers to do many or all of them. In large projects, people may specialize in only one role. In small projects, people may fill several or all roles at the same time. Specializations include: in industry (analysts, architects, developers, testers, technical support, middleware analysts, managers) and in academia (educators, researchers).

Most software engineers and programmers work 40 hours a week, but about 15 percent of software engineers and 11 percent of programmers worked more than 50 hours a week in 2008. Injuries in these occupations are rare. However, like other workers who spend long periods in front of a computer terminal typing at a keyboard, engineers and programmers are susceptible to eyestrain, back discomfort, and hand and wrist problems such as carpal tunnel syndrome.[20]

The field's future looks bright according to Money Magazine and Salary.com, which rated Software Engineer as the best job in the United States in 2006.[21] In 2012, software engineering was again ranked as the best job in the United States, this time by CareerCast.com.[22]


Certification

The Software Engineering Institute offers certifications on specific topics like Security, Process improvement and Software architecture.[23] Apple, IBM, Microsoft and other companies also sponsor their own certification examinations. Many IT certification programs are oriented toward specific technologies, and managed by the vendors of these technologies.[24] These certification programs are tailored to the institutions that would employ people who use these technologies.

Broader certification of general software engineering skills is available through various professional societies. As of 2006, the IEEE had certified over 575 software professionals as a Certified Software Development Professional (CSDP).[25] In 2008 they added an entry-level certification known as the Certified Software Development Associate (CSDA).[26] The ACM had a professional certification program in the early 1980s,[citation needed] which was discontinued due to lack of interest. The ACM examined the possibility of professional certification of software engineers in the late 1990s, but eventually decided that such certification was inappropriate for the professional industrial practice of software engineering.[27]

In the U.K. the British Computer Society has developed a legally recognized professional certification called Chartered IT Professional (CITP), available to fully qualified Members (MBCS). Software engineers may be eligible for membership of the Institution of Engineering and Technology and so qualify for Chartered Engineer status. In Canada the Canadian Information Processing Society has developed a legally recognized professional certification called Information Systems Professional (ISP).[28] In Ontario, software engineers who graduate from a Canadian Engineering Accreditation Board (CEAB) accredited program, successfully complete the Professional Engineers Ontario (PEO) Professional Practice Examination (PPE), and have at least 48 months of acceptable engineering experience are eligible to be licensed through the PEO and can become Professional Engineers (P.Eng).[29] The PEO does not, however, recognize online or distance education, and does not consider computer science programs equivalent to software engineering programs despite the tremendous overlap between the two. This has sparked controversy and a certification war, and has kept the number of P.Eng holders in the profession exceptionally low. The vast majority of working professionals in the field hold a degree in CS, not SE, and given the difficult certification path for holders of non-SE degrees, most never bother to pursue the license.

Impact of globalization

The initial impact of outsourcing, and the relatively lower cost of international human resources in developing countries, led to a massive migration of software development activities from corporations in North America and Europe to India and later to China, Russia, and other developing countries. This approach had some flaws, mainly the distance and timezone differences that hindered human interaction between clients and developers, as well as the lower quality of the software developed by the outsourcing companies and the scale of the job transfer. This had a negative impact on many aspects of the software engineering profession. For example, some students in the developed world avoid education related to software engineering because of the fear of offshore outsourcing (importing software products or services from other countries) and of being displaced by foreign visa workers.[30] Although statistics do not currently show a threat to software engineering itself, a related career, computer programming, does appear to have been affected.[31][32] Nevertheless, the ability to smartly leverage offshore and near-shore resources via the follow-the-sun workflow has improved the overall operational capability of many organizations.[33] When North Americans are leaving work, Asians are just arriving; when Asians are leaving, Europeans are arriving. This provides a continuous ability to have human oversight of business-critical processes 24 hours per day, without paying overtime compensation or disrupting sleep patterns, a key human-resource concern.

While global outsourcing has several advantages, global (and generally distributed) development can run into serious difficulties resulting from the distance between developers. These include, but are not limited to, language, communication, cultural, and corporate barriers. Handling global development successfully is an active research topic of the software engineering community.

Education

Knowledge of programming is a prerequisite to becoming a software engineer. In 2004 the IEEE Computer Society produced the SWEBOK, which has been published as ISO/IEC Technical Report 19759:2004, describing the body of knowledge that they believe should be mastered by a graduate software engineer with four years of experience.[34] Many software engineers enter the profession by obtaining a university degree or training at a vocational school. One standard international curriculum for undergraduate software engineering degrees was defined by the CCSE, and updated in 2004.[35] A number of universities have software engineering degree programs; as of 2010, there were 244 campus programs, 70 online programs, 230 masters-level programs, 41 doctorate-level programs, and 69 certificate-level programs in the United States.[36]

In addition to university education, many companies sponsor internships for students wishing to pursue careers in information technology. These internships can introduce the student to interesting real-world tasks that typical software engineers encounter every day. Similar experience can be gained through military service in software engineering.

Comparison with other disciplines

Major differences between software engineering and other engineering disciplines, according to some researchers, result from the costs of fabrication.[37]

Software Process

A set of activities that leads to the production of a software product is known as a software process.[38] Although most software is custom-built, the software engineering market is gradually shifting towards component-based development. Computer-aided software engineering (CASE) tools are used to support the software process activities. However, due to the vast diversity of software processes for different types of products, the effectiveness of CASE tools is limited. No ideal approach to the software process has yet been developed. Some fundamental activities, such as software specification, design, validation and maintenance, are common to all process models.[39]

Models

A software process model is an abstraction of a software process; such models are also called process paradigms. General process models include the waterfall model, the evolutionary development model and the component-based software engineering model. These are widely used in current software engineering practice, and for large systems they are often used together.[40]

Waterfall model

The waterfall model was one of the first published models of the software process. It divides the software process into distinct phases:[41]

Requirements analysis
Software design
Unit testing
System testing
Maintenance

Theoretically the activities should be performed individually, but in practice they often overlap. During the maintenance stage the software is in use; additional problems may be discovered and the need for new features may arise, which can require the software to pass through the earlier phases once again.[42]

Subdisciplines

Software engineering can be divided into ten subdisciplines. They are:[1]

Software requirements : The elicitation, analysis, specification, and validation of requirements for software.

Software design : The process of defining the architecture, components, interfaces, and other characteristics of a system or component. It is also defined as the result of that process.

Software construction : The detailed creation of working, meaningful software through a combination of coding, verification, unit testing, integration testing, and debugging.

Software testing : The dynamic verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the expected behavior.

Software maintenance : The totality of activities required to provide cost-effective support to software.

Software configuration management : The identification of the configuration of a system at distinct points in time for the purpose of systematically controlling changes to the configuration, and maintaining the integrity and traceability of the configuration throughout the system life cycle.

Software engineering management: The application of management activities—planning, coordinating, measuring, monitoring, controlling, and reporting—to ensure that the development and maintenance of software is systematic, disciplined, and quantified.

Software engineering process : The definition, implementation, assessment, measurement, management, change, and improvement of the software life cycle process itself.

Software engineering tools and methods: The computer-based tools that are intended to assist the software life cycle processes, see Computer Aided Software Engineering, and the methods which impose structure on the software engineering activity with the goal of making the activity systematic and ultimately more likely to be successful.

Software quality : The degree to which a set of inherent characteristics fulfills requirements.

Related disciplines

Software engineering is a direct subfield of computer science and has some relations with management science. It is also considered a part of overall systems engineering.

Systems engineering

Systems engineers deal primarily with the overall system requirements and design, including hardware and human issues. They are often concerned with partitioning functionality to hardware, software or human operators. Therefore, the output of the systems engineering process serves as an input to the software engineering process.

Computer software engineers

Computer software engineers are usually systems-level (software engineering, information systems) computer science graduates or software-level computer engineering graduates[citation needed]. The term also includes general computer science graduates with a few years of practical on-the-job experience involving software engineering.

See also


Main article: Outline of software engineering

Bachelor of Science in Information Technology
Bachelor of Software Engineering
List of software engineering conferences
List of software engineering publications
Software craftsmanship


Engineering

The steam engine, a major driver in the Industrial Revolution, underscores the importance of engineering in modern history. This beam engine is on display at the main building of the ETSIIM in Madrid, Spain.

Engineering is the application of scientific, economic, social, and practical knowledge, in order to design, build, and maintain structures, machines, devices, systems, materials and processes. It may encompass using insights to conceive, model and scale an appropriate solution to a problem or objective. The discipline of engineering is extremely broad, and encompasses a range of more specialized fields of engineering, each with a more specific emphasis on particular areas of technology and types of application.

The American Engineers' Council for Professional Development (ECPD, the predecessor of ABET)[1] has defined "engineering" as:

The creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them singly or in combination; or to construct or operate the same with full cognizance of their design; or to forecast their behavior under specific operating conditions; all as respects an intended function, economics of operation or safety to life and property.[2][3]

One who practices engineering is called an engineer, and those licensed to do so may have more formal designations such as Professional Engineer, Chartered Engineer, Incorporated Engineer, Ingenieur or European Engineer.

Contents


1 History
 o 1.1 Ancient era
 o 1.2 Renaissance era
 o 1.3 Modern era
2 Main branches of engineering
3 Methodology
 o 3.1 Problem solving
 o 3.2 Computer use
4 Social context
5 Relationships with other disciplines
 o 5.1 Science
 o 5.2 Medicine and biology
 o 5.3 Art
 o 5.4 Other fields
6 See also
7 References
8 Further reading
9 External links

History

Main article: History of engineering

Engineering has existed since ancient times as humans devised fundamental inventions such as the pulley, lever, and wheel. Each of these inventions is consistent with the modern definition of engineering, exploiting basic mechanical principles to develop useful tools and objects.

The term engineering itself has a much more recent etymology, deriving from the word engineer, which itself dates back to 1325, when an engine'er (literally, one who operates an engine) originally referred to "a constructor of military engines."[4] In this context, now obsolete, an "engine" referred to a military machine, i.e., a mechanical contraption used in war (for example, a catapult). Notable exceptions of the obsolete usage which have survived to the present day are military engineering corps, e.g., the U.S. Army Corps of Engineers.

The word "engine" itself is of even older origin, ultimately deriving from the Latin ingenium (c. 1250), meaning "innate quality, especially mental power, hence a clever invention."[5]

Later, as the design of civilian structures such as bridges and buildings matured as a technical discipline, the term civil engineering [3] entered the lexicon as a way to distinguish between those specializing in the construction of such non-military projects and those involved in the older discipline of military engineering.

Ancient era


The Ancient Romans built aqueducts to bring a steady supply of clean fresh water to cities and towns in the empire.

The Pharos of Alexandria, the pyramids in Egypt, the Hanging Gardens of Babylon, the Acropolis and the Parthenon in Greece, the Roman aqueducts, Via Appia and the Colosseum, Teotihuacán and the cities and pyramids of the Mayan, Inca and Aztec Empires, the Great Wall of China, the Brihadeshwara temple of Tanjavur and tombs of India, among many others, stand as a testament to the ingenuity and skill of the ancient civil and military engineers.

The earliest civil engineer known by name is Imhotep.[3] As one of the officials of the Pharaoh, Djosèr, he probably designed and supervised the construction of the Pyramid of Djoser (the Step Pyramid) at Saqqara in Egypt around 2630-2611 BC.[6] He may also have been responsible for the first known use of columns in architecture[citation needed].

Ancient Greece developed machines in both the civilian and military domains. The Antikythera mechanism, the first known mechanical computer,[7][8] and the mechanical inventions of Archimedes are examples of early mechanical engineering. Some of Archimedes' inventions as well as the Antikythera mechanism required sophisticated knowledge of differential gearing or epicyclic gearing, two key principles in machine theory that helped design the gear trains of the Industrial revolution, and are still widely used today in diverse fields such as robotics and automotive engineering.[9]

Chinese, Greek and Roman armies employed complex military machines and inventions such as artillery, which was developed by the Greeks around the 4th century BC,[10] the trireme, the ballista and the catapult. In the Middle Ages, the trebuchet was developed.

Renaissance era

William Gilbert, with his 1600 publication of De Magnete, is considered the first electrical engineer; he coined the term "electricity".[11]

The first steam engine was built in 1698 by mechanical engineer Thomas Savery.[12] The development of this device gave rise to the industrial revolution in the coming decades, allowing for the beginnings of mass production.


With the rise of engineering as a profession in the 18th century, the term became more narrowly applied to fields in which mathematics and science were applied to these ends. Similarly, in addition to military and civil engineering the fields then known as the mechanic arts became incorporated into engineering.

Modern era

The International Space Station represents a modern engineering challenge from many disciplines.

Electrical engineering can trace its origins to the experiments of Alessandro Volta in the 1800s, the experiments of Michael Faraday, Georg Ohm and others, and the invention of the electric motor in 1872. The work of James Maxwell and Heinrich Hertz in the late 19th century gave rise to the field of electronics. The later inventions of the vacuum tube and the transistor accelerated the development of electronics to such an extent that electrical and electronics engineers currently outnumber their colleagues in any other engineering specialty.[3]

The inventions of Thomas Savery and the Scottish engineer James Watt gave rise to modern mechanical engineering. The development of specialized machines and their maintenance tools during the industrial revolution led to the rapid growth of mechanical engineering both in its birthplace Britain and abroad.[3]

John Smeaton was the first self-proclaimed civil engineer, and is often regarded as the "father" of civil engineering. He was an English civil engineer responsible for the design of bridges, canals, harbours and lighthouses. He was also a capable mechanical engineer and an eminent physicist. Smeaton designed the third Eddystone Lighthouse (1755–59), where he pioneered the use of 'hydraulic lime' (a form of mortar which will set under water) and developed a technique involving dovetailed blocks of granite in the building of the lighthouse. His lighthouse remained in use until 1877, when it was dismantled and partially rebuilt at Plymouth Hoe, where it is known as Smeaton's Tower. He is important in the history, rediscovery, and development of modern cement because he identified the compositional requirements needed to obtain "hydraulicity" in lime, work which led ultimately to the invention of Portland cement.

Chemical engineering, like its counterpart mechanical engineering, developed in the nineteenth century during the Industrial Revolution.[3] Industrial scale manufacturing demanded new materials and new processes, and by 1880 the need for large scale production of chemicals was such that a new industry was created, dedicated to the development and large scale manufacturing of chemicals in new industrial plants.[3] The role of the chemical engineer was the design of these chemical plants and processes.[3]

Aeronautical engineering deals with aircraft design while aerospace engineering is a more modern term that expands the reach of the discipline by including spacecraft design.[13] Its origins can be traced back to the aviation pioneers around the start of the 20th century although the work of Sir George Cayley has recently been dated as being from the last decade of the 18th century. Early knowledge of aeronautical engineering was largely empirical with some concepts and skills imported from other branches of engineering.[14]

The first PhD in engineering (technically, applied science and engineering) awarded in the United States went to Willard Gibbs at Yale University in 1863; it was also the second PhD awarded in science in the U.S.[15]

Only a decade after the successful flights by the Wright brothers, there was extensive development of aeronautical engineering through the development of military aircraft used in World War I. Meanwhile, research combining theoretical physics with experiments continued to provide the fundamental background science.

In 1990, with the rise of computer technology, the first search engine was built by computer engineer Alan Emtage.

Main branches of engineering

Main article: List of engineering branches

Hoover dam

Engineering, much like other sciences, is a broad discipline which is often broken down into several sub-disciplines. These disciplines concern themselves with differing areas of engineering work. Although an engineer will usually be trained in a specific discipline initially, over the course of a career the engineer may become multi-disciplined, having worked in several of the outlined areas. Engineering is often characterized as having four main branches:[16][17]

Chemical engineering – The application of physics, chemistry, biology, and engineering principles in order to carry out chemical processes on a commercial scale.


Civil engineering – The design and construction of public and private works, such as infrastructure (airports, roads, railways, water supply and treatment etc.), bridges, dams, and buildings.

Electrical engineering – The design and study of various electrical and electronic systems, such as electrical circuits, generators, motors, electromagnetic/electromechanical devices, electronic devices, electronic circuits, optical fibers, optoelectronic devices, computer systems, telecommunications, instrumentation, controls, and electronics.

Mechanical engineering – The design of physical or mechanical systems, such as power and energy systems, aerospace/aircraft products, weapon systems, transportation products, engines, compressors, powertrains, kinematic chains, vacuum technology, and vibration isolation equipment.

Beyond these four, sources vary on other main branches. Historically, naval engineering and mining engineering were major branches. Modern fields sometimes included as major branches include aerospace, computer, petroleum, systems, audio, software, architectural, biosystems, biomedical,[18] industrial, materials,[19] and nuclear[20] engineering.[citation needed]

New specialties sometimes combine with the traditional fields to form new branches: for example, Earth Systems Engineering and Management involves a wide range of subject areas including anthropology, engineering, environmental science, ethics and philosophy. A new or emerging area of application will commonly be defined temporarily as a permutation or subset of existing disciplines; there is often a gray area as to when a given sub-field becomes large or prominent enough to warrant classification as a new "branch." One key indicator of such emergence is when major universities start establishing departments and programs in the new field.

For each of these fields there exists considerable overlap, especially in the areas of the application of sciences to their disciplines such as physics, chemistry and mathematics.

Methodology


Design of a turbine requires collaboration of engineers from many fields, as the system involves mechanical, electro-magnetic and chemical processes. The blades, rotor and stator as well as the steam cycle all need to be carefully designed and optimized.

Engineers apply mathematics and sciences such as physics to find suitable solutions to problems or to make improvements to the status quo. More than ever, engineers are now required to have knowledge of relevant sciences for their design projects. As a result, they may keep on learning new material throughout their career.

If multiple options exist, engineers weigh different design choices on their merits and choose the solution that best matches the requirements. The crucial and unique task of the engineer is to identify, understand, and interpret the constraints on a design in order to produce a successful result. It is usually not enough to build a technically successful product; it must also meet further requirements.

Constraints may include available resources, physical, imaginative or technical limitations, flexibility for future modifications and additions, and other factors, such as requirements for cost, safety, marketability, productibility, and serviceability. By understanding the constraints, engineers derive specifications for the limits within which a viable object or system may be produced and operated.

Problem solving

Engineers use their knowledge of science, mathematics, logic, economics, and appropriate experience or tacit knowledge to find suitable solutions to a problem. Creating an appropriate mathematical model of a problem allows them to analyze it (sometimes definitively), and to test potential solutions.
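As a hypothetical illustration of this idea (the model and all names below are illustrative, not drawn from the article), even a very simple mathematical model, such as the drag-free projectile, lets an engineer analyze a problem and test potential solutions before building anything:

```python
import math

# Idealized projectile model (no air resistance): horizontal range as a
# function of launch angle. v0 is launch speed in m/s, g is gravity in m/s^2.
def projectile_range(v0, angle_deg, g=9.81):
    theta = math.radians(angle_deg)
    return (v0 ** 2) * math.sin(2 * theta) / g

# Test potential solutions: which candidate launch angle carries farthest?
candidates = [15, 30, 45, 60, 75]
best = max(candidates, key=lambda a: projectile_range(20.0, a))
print(best)  # 45 -- the idealized model predicts 45 degrees maximizes range
```

Because the model is explicit, its predictions can be checked against experiment, and the model refined where reality (drag, wind) diverges from it.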

Usually multiple reasonable solutions exist, so engineers must evaluate the different design choices on their merits and choose the solution that best meets their requirements. Genrich Altshuller, after gathering statistics on a large number of patents, suggested that compromises are at the heart of "low-level" engineering designs, while at a higher level the best design is one which eliminates the core contradiction causing the problem.

Engineers typically attempt to predict how well their designs will perform to their specifications prior to full-scale production. They use, among other things: prototypes, scale models, simulations, destructive tests, nondestructive tests, and stress tests. Testing ensures that products will perform as expected.

Engineers take on the responsibility of producing designs that will perform as well as expected and will not cause unintended harm to the public at large. Engineers typically include a factor of safety in their designs to reduce the risk of unexpected failure. However, the greater the safety factor, the less efficient the design may be.
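As a minimal sketch of the idea (the loads below are hypothetical values chosen only for illustration), a factor of safety is simply the ratio of the load a design can withstand to the load it is expected to carry:

```python
# Factor of safety = failure (capacity) load / expected working load.
# The numeric values here are hypothetical, chosen only to illustrate the ratio.
def factor_of_safety(failure_load, working_load):
    if working_load <= 0:
        raise ValueError("working load must be positive")
    return failure_load / working_load

fos = factor_of_safety(failure_load=12000.0, working_load=4000.0)
print(fos)  # 3.0 -- the design tolerates three times its expected load
```

The trade-off described above is visible directly in the ratio: raising the factor of safety means over-building relative to the expected load, which costs material and efficiency.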

The study of failed products is known as forensic engineering, and can help the product designer in evaluating his or her design in the light of real conditions. The discipline is of greatest value after disasters, such as bridge collapses, when careful analysis is needed to establish the cause or causes of the failure.

Computer use

A computer simulation of high velocity air flow around the Space Shuttle during re-entry. Solutions to the flow require modelling of the combined effects of the fluid flow and heat equations.

As with all modern scientific and technological endeavors, computers and software play an increasingly important role. As well as the typical business application software there are a number of computer aided applications (Computer-aided technologies) specifically for engineering. Computers can be used to generate models of fundamental physical processes, which can be solved using numerical methods.
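As a minimal example of solving a physical model numerically (the model, constants, and function name are all illustrative assumptions, not from the article), Newton's law of cooling can be integrated with the explicit Euler method, one of the simplest numerical schemes:

```python
# Newton's law of cooling: dT/dt = -k * (T - T_ambient).
# Integrated with explicit Euler time-stepping. The rate constant k,
# step size dt, and temperatures are hypothetical illustrative values.
def cool(T0, T_ambient, k, dt, steps):
    T = T0
    for _ in range(steps):
        T += -k * (T - T_ambient) * dt
    return T

# An object at 90 deg C cooling toward a 20 deg C room for 10 simulated seconds.
T_final = cool(T0=90.0, T_ambient=20.0, k=0.1, dt=0.1, steps=100)
# T_final approaches the analytic solution 20 + 70 * exp(-0.1 * 10) ~ 45.75;
# shrinking dt makes the numerical answer converge to it.
```

Real engineering codes solve far larger systems (fluid flow, heat transfer, structural stress) with more sophisticated methods, but the pattern is the same: discretize the governing equations and step them numerically.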

One of the most widely used tools in the profession is computer-aided design (CAD) software such as Autodesk Inventor, Dassault Systèmes SolidWorks, or Pro/ENGINEER, which enables engineers to create 3D models, 2D drawings, and schematics of their designs. CAD together with digital mockup (DMU) and CAE software such as finite element method analysis or analytic element method allows engineers to create models of designs that can be analyzed without having to make expensive and time-consuming physical prototypes.


These allow products and components to be checked for flaws, assessed for fit and assembly, and studied for ergonomics, and let engineers analyze static and dynamic characteristics of systems such as stresses, temperatures, electromagnetic emissions, electrical currents and voltages, digital logic levels, fluid flows, and kinematics. Access to and distribution of all this information is generally organized with the use of Product Data Management software.[21]

There are also many tools to support specific engineering tasks such as computer-aided manufacture (CAM) software to generate CNC machining instructions; Manufacturing Process Management software for production engineering; EDA for printed circuit board (PCB) and circuit schematics for electronic engineers; MRO applications for maintenance management; and AEC software for civil engineering.

In recent years the use of computer software to aid the development of goods has collectively come to be known as Product Lifecycle Management (PLM).[22]

Social context


Engineering is a subject that ranges from large collaborations to small individual projects. Almost all engineering projects are beholden to some sort of financing agency: a company, a set of investors, or a government. The few types of engineering that are minimally constrained by such issues are pro bono engineering and open design engineering.

By its very nature engineering is bound up with society and human behavior. Every product or construction used by modern society will have been influenced by engineering design. Engineering design is a very powerful tool to make changes to environment, society and economies, and its application brings with it a great responsibility. Many engineering societies have established codes of practice and codes of ethics to guide members and inform the public at large.

Engineering projects can be subject to controversy. Examples from different engineering disciplines include the development of nuclear weapons, the Three Gorges Dam, the design and use of Sport utility vehicles and the extraction of oil. In response, some western engineering companies have enacted serious corporate and social responsibility policies.

Engineering is a key driver of human development.[23] Sub-Saharan Africa in particular has a very small engineering capacity which results in many African nations being unable to develop crucial infrastructure without outside aid. The attainment of many of the Millennium Development Goals requires the achievement of sufficient engineering capacity to develop infrastructure and sustainable technological development.[24]


All overseas development and relief NGOs make considerable use of engineers to apply solutions in disaster and development scenarios. A number of charitable organizations aim to use engineering directly for the good of mankind:

Engineers Without Borders
Engineers Against Poverty
Registered Engineers for Disaster Relief
Engineers for a Sustainable World
Engineering for Change
Engineering Ministries International[25]

en·gi·neer·ing (ĕn′jə-nîr′ĭng) n. 1. a. The application of scientific and mathematical principles to practical ends such as the design, manufacture, and operation of efficient and economical structures, machines, processes, and systems. b. The profession of or the work performed by an engineer. 2. Skillful maneuvering or direction: geopolitical engineering; social engineering.

The American Heritage® Dictionary of the English Language, Fourth Edition copyright ©2000 by Houghton Mifflin Company. Updated in 2009. Published by Houghton Mifflin Company. All rights reserved.


engineering [ˌɛndʒɪˈnɪərɪŋ] n. (Business / Professions) the profession of applying scientific principles to the design, construction, and maintenance of engines, cars, machines, etc. (mechanical engineering); buildings, bridges, roads, etc. (civil engineering); electrical machines and communication systems (electrical engineering); chemical plant and machinery (chemical engineering); or aircraft (aeronautical engineering). See also military engineering.


Collins English Dictionary – Complete and Unabridged © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003


engineering (ĕn′jə-nîr′ĭng) The application of science to practical uses such as the design of structures, machines, and systems. Engineering has many specialties such as civil engineering, chemical engineering, and mechanical engineering.