Reading further broadens your mind to many of the things that go on around you and strengthens your position as a real engineer.
How can systems engineering be applied in business?
At the end of the module, I realize that reducing complication and risk to zero is impossible; we can only reduce the complications of a project by carefully analysing its future needs.
The more complicated a system becomes, the more its complexity gives rise to risk, and attention should stay focused on the end users: the more complex a system gets, the more likely its adoption is to fall, because people prefer an attractive, simple-to-use environment.
There is risk within any project; the task is to minimize that risk as far as possible by following the project life cycle.
Figure 3 showed five commonly encountered problems of effecting different types of change. These are notionally located on a spectrum of change that ranges from no change at all, to complete revolution. The relationship suggested on the figure is that as the degree of change - represented by the different types of problem - increases so, too, do difficulty and risk. Each of the five problems of effecting change can be regarded as a gap between an existing situation and an alternative, desired or preferable situation.
To close these gaps, be they the correction of faults or the design and implementation of a completely new, innovative system, requires the deployment and consumption of resources. These resources may be a mixture of people, materials, equipment, objects, and information. These two characteristics of change - the nature of the change problem and the use of resources - can be used as the basis for developing the picture shown in Figure 3 into a more complete model of change, and taken together represent the certainty of outcome of a change project.
Figure 9 shows change problems divided into three categories: simple, complicated and complex. The two dimensions of the problem of effecting change, knowing what is required and knowing how to achieve what is required, are shown as two axes that each run from high to low - from complete knowledge and certainty to ‘haven't a clue’.
Change problems are rendered (relatively) simple by a high degree of knowledge of what needs to be done and how to do it. I have termed the dominant form of knowledge required to address this type of problem as ‘craft knowledge’, since it is the product of a learning that is essentially experiential in character, being a result of meeting and tackling successfully similar problems in the past. This craft knowledge has been termed as ‘tacit’ and may be embedded in individuals or in the organization itself in informal rules and procedures. In such situations certainty of outcome is high.
As uncertainty increases the change problem becomes more complicated and less amenable to solution through the application of tacit knowledge. Knowledge of what is needed or how to achieve what is required, or both, is less certain.
In such situations those involved are likely to fall back on formal knowledge, either of first principles or that which has been embedded in formal rules and procedures. These complicated change tasks are often solved by the application of traditional engineering knowledge.
The third type of change problem shown in Figure 9 has high uncertainty as it is complex. Traditional forms of engineering knowledge no longer suffice and it is in application to this type of problem that systems engineering knowledge comes into its own. This type of knowledge is both ‘systemic’ and ‘systematic’.
It is systemic because, being based on the systems principle of ‘holism’, it views the change problem as a whole, resisting the inclination to see it from the perspective of a particular function or discipline. It is ‘systematic’ because it embodies rational frameworks and approaches that reduce uncertainty.
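The two-axis model can be sketched as a simple lookup. This is only an illustrative reduction of Figure 9, under the assumption that each knowledge dimension is scored as simply high or low; the function name and return labels are my own, not from the source.

```python
def classify_change_problem(knows_what: bool, knows_how: bool) -> str:
    """Toy classifier for the Figure 9 model: the two knowledge
    dimensions ('what is required' and 'how to achieve it') map
    to a type of change problem and its dominant knowledge form."""
    if knows_what and knows_how:
        return "simple"        # craft (tacit) knowledge suffices
    if knows_what or knows_how:
        return "complicated"   # formal engineering knowledge needed
    return "complex"           # systems engineering knowledge needed

print(classify_change_problem(True, True))    # simple
print(classify_change_problem(True, False))   # complicated
print(classify_change_problem(False, False))  # complex
```

In reality each axis is a continuum rather than a binary, so the boundaries between the three regions are a matter of judgement; the sketch only captures the ordering of the categories.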
The following page shows Figure 9, ‘A model of simplicity, complication and complexity of systems problems related to type of knowledge’.
This point gave rise to a fierce debate in the original Course Team. The External Assessor argued strongly that there are other factors than the degree of knowledge that contribute to complexity such as the increased coupling of systems and the strength and breadth of their interaction. These give rise to stronger emergent behaviour. While agreeing with this point, others stated that it was the lack of knowledge and, therefore, the failure to predict these factors that gave rise to increasing risk.
An example of the three types of change problem will help to illustrate the model. Suppose that I want to extend the electrical wiring in my house into the garden to run some lights and a water feature. Designing an extension of this type is well within the capability of a competent electrician, and I am confident that the lights and water feature can be got to work satisfactorily and safely. Craft knowledge of a readily available kind is required to deal successfully with this simple problem.
A more complicated problem is consequent on a decision to install a new security system for the house. This requires more specialist knowledge than a simple extension to existing wiring and is a more difficult design task. I am likely to employ a firm that specializes in this type of project, which may also involve, in supporting roles, other areas of knowledge such as glaziers, plasterers and so on.
The complex level of change problem can be illustrated by the decision of a building company to design a dwelling with a fully automated control system. Since, as yet, only limited prototypes of this type of accommodation have been developed, the requirements and functionality of such a system are uncertain.
Equally vague are the domains of knowledge that would be needed to design and implement such a system successfully. As a result, the outcomes of a project to design a ‘house automation system’ are highly uncertain.
Four observations can be made about the change problem model presented in Figure 9. Uncertainty increases with the degree of turbulence in the environment of the change problem. This turbulence may be associated with change in:
• Underlying technologies, either those embodied in the product or service in question or those that are to do with how to achieve what is required,
• The business or competitive environment,
• The political environment,
• The social environment,
• The economic environment.
The existing knowledge base of individuals and organisations will bias their perception of a problem and how it can be solved.
The organization may suffer from unconscious incompetence, not being aware of what it does not know. These two factors, environmental change and perception of the nature of the problem, increase the degree of uncertainty that is associated with a need for change and, consequentially, with its riskiness.
Risk can be defined as the probability of an unexpected outcome. Naturally enough we have an asymmetrical attitude to unexpectedness. We don't mind positive unexpected outcomes but want to avoid nasty ones and their consequences.
Our tendency is, therefore, to be risk averse, and only if we are offered a greater return for doing so will we take on extra risk:
• Those undertaking dangerous sports are compensated by the psychological return that they provide, or the social status that participation confers,
• Punters on a horse race are offered better odds on outsiders than on the favorite,
• Motor insurance companies want bigger premiums from drivers with a poor claims history,
• Investors in the stock market demand greater returns from shares that are more volatile than the average of the market as a whole.
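This asymmetry can be illustrated numerically. The figures below are invented for illustration: a risk-averse decision maker, modelled here with a concave (square-root) utility function, values a certain sum more than a gamble with the same expected value, which is why a premium must be offered before extra risk is accepted.

```python
import math

def expected_utility(outcomes, utility=math.sqrt):
    """Expected utility of a list of (probability, payoff) pairs.
    A concave utility function models risk aversion: losses of
    certainty hurt more than equal gains of upside help."""
    return sum(p * utility(x) for p, x in outcomes)

certain = [(1.0, 100.0)]              # 100 for sure
gamble = [(0.5, 0.0), (0.5, 200.0)]   # same expected value: 100

print(expected_utility(certain))  # 10.0
print(expected_utility(gamble))   # ~7.07: worth less despite equal expectation
```

The gamble only becomes attractive once its payoffs are raised enough to compensate, which is the risk premium the bullet points above describe in different guises.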
As a consequence, though it may be tempting to do so, businesses which undertake only safe forms of change, those that fall within their comfort zone, will not do better than the average. Taking on risk is uncomfortable but necessary because it brings with it greater financial returns and increased knowledge and learning. Systems engineering is a way of reducing the inherent riskiness of the new and complex.
In the remainder of this module we will discuss the issues for systems engineering associated with the topics of simplicity, complication, and complexity.
Following the model in Figure 9, these closely related and, in some instances, overlapping topics will be examined in relation to the difficulty that they create for a systems engineer in terms of what he or she has to do, and the way that the work is performed.
The winter of 1665/66 must have been exceptionally harrowing for the inhabitants of England. Along with the winter weather, the country suffered an outbreak of the plague. A minor effect of this was a decision by Trinity College Cambridge to close its doors. One of those affected by this decision was a young Fellow, Isaac Newton, who returned home to spend the winter in the Lincolnshire rectory in which he had been brought up.
Isolated in the bleak fens and without college high table and the conviviality of the other Fellows to distract him, he found himself at a loose end, so the 22-year-old Newton buckled down and, during the next 12 months, generalized the binomial theorem, invented calculus, discovered the universal law of gravitation, and developed a theory of color.
Eventually the threat of the plague lessened and Newton returned to Trinity, where he was elected Lucasian Professor of Mathematics. The work that he did during the 1665/66 winter became the basis for Philosophiae Naturalis Principia Mathematica (The Mathematical Principles of Natural Philosophy), which was first published in 1687.
The importance of Newton's work cannot be overestimated, and it is no exaggeration to regard 1665/66 as the beginning of the modern world. The mysterious, magical world of the Middle Ages was replaced by one amenable to rational analysis. Explanation based on myth, magic or the unknowable will of a divinity gave way to observation, calculation and the operation of universal laws.
The achievement of Newton, and others who built on his work, was to provide ways of understanding relationships and interactions in the physical, observable world. In doing so they reduced its complexity to mere complication at worst and simplicity at best. As suggested earlier in this unit, simplicity, complication and complexity are closely related to perception, understanding and the existing knowledge base.
If we are faced with a problem that we do not fully understand or one that we cannot see how to solve, we label it ‘complex’.
Effectively, we are saying that there is an unknown area that needs to be explored, and a way of dealing with it established. A close conceptual relation of complexity is complication.
The wristwatch shown in Figure 10 happens to have a glass back through which its mechanism can be viewed. It's an interesting world inside the watch case, with lots of tiny parts interacting with one another. It is complicated but not complex. There is no ‘unknown’ element in the nature of the outputs of the watch or how the mechanism achieves them. Although personally I couldn't construct a watch, there are plenty of people with the necessary skills. It's a ‘known problem’, albeit a complicated, tricky one.
The watch is an illustration of the physical world of objects governed by Newtonian physics and for which, therefore, we have good explanatory models. There are, however, three other ‘worlds’ for which we do not, as yet, have models of equal stature.
Figure 11 shows our level of explanatory confidence as a function of three worlds - the ‘sub-physical’ world of quantum physics, the world of physical objects governed by Newtonian mechanics, and a ‘supra-physical’ world of complex systems, which includes a fourth world of human activity systems. In both the sub- and supra-physical worlds there is considerably less success of explanation and therefore greater inherent complexity.
In 1927, the German particle physicist Werner Heisenberg (1901-1976) put forward the view that at a subatomic level it was possible to determine either the position of a particle or its momentum, but not both. In order to study the behaviour of subatomic particles it is necessary to bounce other subatomic particles off them or to get them to collide with other subatomic particles.
Either of these two actions destroys what was happening and so leaves it a mystery. Heisenberg's uncertainty principle states that what happens down in the depths of the subatomic world is unknowable.
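The principle is usually quoted quantitatively in the form below, which bounds the product of the uncertainties in position and momentum from below; this standard textbook formulation is added here for reference rather than taken from the source.

```latex
\Delta x \, \Delta p \ \ge \ \frac{\hbar}{2}
```

Here $\Delta x$ and $\Delta p$ are the uncertainties (standard deviations) in position and momentum, and $\hbar$ is the reduced Planck constant. However precisely one quantity is measured, the other's uncertainty cannot fall below this bound.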
In 1968, the Austrian theoretical biologist Ludwig von Bertalanffy published General System Theory (von Bertalanffy, 1968). Although elements of this work had precursors, von Bertalanffy's work was essentially the basis of academic interest in ‘systems’ as a subject. The conceptual basis of systems is discussed in more detail later but one of its cornerstones - emergent properties - is relevant here.
This concept states that the properties and behaviour of a system cannot be deduced from studying the properties or behaviours of its elements in isolation. At one level this principle hardly rises above the banal. Everything, be it a physical object or conceptual system, exhibits properties and behaviours that result from the interaction of its constituent elements and which, therefore, are not to be found in those elements in isolation.
There are, I believe, no exceptions to this statement. This means that the possession of emergent properties cannot be regarded as a distinguishing feature of a system.
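A toy illustration of the principle, on the assumption that a cellular automaton counts as a ‘system’: in Conway's Game of Life, each cell obeys trivial local rules, yet a ‘glider’ - a pattern that reproduces itself one cell diagonally away every four generations - is a behaviour found nowhere in the rules for a single cell. It arises only from the interaction of the elements.

```python
from collections import Counter

def step(alive):
    """One generation of Conway's Game of Life on an unbounded grid.
    A cell lives next turn if it has 3 live neighbours, or 2 and is
    already alive - rules that say nothing about moving patterns."""
    counts = Counter((r + dr, c + dc)
                     for (r, c) in alive
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)

# After four generations the glider has translated itself by (1, 1):
print(g == {(r + 1, c + 1) for (r, c) in glider})  # True
```

The glider's steady diagonal motion is an emergent property in exactly the sense described above: it results from the interaction of the cells and is not to be found in any cell in isolation.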
However, it is often the case that systems, even simple ones, exhibit behaviours that are unexpected and which surprise their designers, users or observers.
Sometimes these behaviours could have been foreseen, but through oversight or negligence were not considered during the design phase of the system. Of equal interest to these preventable emergent properties are those that could not have been foreseen and which are genuinely unexpected.
There are external and internal reasons for the occurrence of these. This point will be examined in more detail in module 2.
Externally caused emergence occurs when a system reacts to its environment in a way that could not have been predicted. There are two origins of unexpected externally caused emergence.
The system has not been designed to be robust against variation in part of its environment. For example, part-way through the construction of the light-railway system serving London's Docklands area an announcement was made of the massive office development at Canary Wharf. On its own this project completely negated all the carefully calculated predictions of traffic for the new railway. To compound the problem the Canary Wharf announcement was made when the construction of the railway, its rolling stock and associated traffic management systems were all well advanced. It was thought that nothing could be done to accommodate the traffic that the Canary Wharf development was expected to generate but, in the event, the railway has coped remarkably well. Emergent properties do not always mean failure.
The occurrence of a new element in the system's environment. The example of the Docklands Light Railway illustrated an unexpected variation in one of the important parameters used as the basis for the railway's design. Sometimes, however, a new factor will occur in the environment. The more complex the system, the longer (all other things being equal) the design, development and implementation processes take, and therefore the more likely it is that unpredictable factors affecting the system will occur in the environment.
The Iridium system was conceived as providing worldwide telecommunications through a network of satellites in low Earth orbit. Problems with the launch vehicles and the performance of the satellites themselves delayed the project, which was 12 years in development. In the meantime the interconnection of networks of terrestrial systems had overtaken the concept. Iridium declared itself bankrupt in August 1999 but may be resurrected as a system for specialist communication.
Internally generated emergence occurs when the elements of the system interact with each other in an unpredicted way. A potent source of this type of emergence is the behaviour of sentient creatures - most often humans - within the system. Once again, the Millennium Bridge provides an example of this. If only the people crossing the bridge had not perversely attempted to compensate for its lateral movement everything would have been fine and the ‘blade of light’ would have remained unsullied by dampers and struts. Emergence can also arise from the unforeseen interactions between the elements of the system and its environment.
Because we never know everything that is salient at the time the knowledge is required, emergence retains its often unexpected, unpredictable character; it remains mysterious, adding to the difficulties of undertaking a complex systems engineering task.