From a technical perspective, industry faces a large number of technology issues. FENET addresses these under three broad technology categories, plus an activity dealing with Education and Dissemination. These are discussed further in the relevant subsections.
Barriers to the uptake and effective use of finite element technology can be grouped into two categories.
In the industry-focused discussions, education and training features highly on the wish list for improvement. There are, of course, numerous initiatives aimed at meeting this deficiency, ranging from academic short courses to code-specific training courses, and from web-based material to traditional textbooks. This diversity is both a strength and a weakness: industry tends to find the academic material too theoretical, too diffuse and insufficiently focused on industrial issues, while code-specific training material can be too parochial and lacking in objectivity. In particular, it tends to be skewed towards what "can be done"; industry also needs to know what does not work. At the same time, academics tend to be concerned at the apparent desire to de-skill the analysis process by "procedurising" it and by attempts to remove all mathematics from user manuals.
Against this background there is an ongoing need to discuss and debate the real needs of industrial users and how they should be met. This will stimulate the production of material to satisfy the needs of industry and encourage more effective forms of delivery. For example, initial discussions have identified a number of projects addressing training through emerging techniques such as web-based learning. The web-based survey of finite element users identified ways of capturing and re-using experience as a high priority, and there is perhaps scope to revisit knowledge management and expert systems technology.
There is also a requirement to examine the need for certification of engineers and accreditation of courses to a uniform standard. It is envisaged that this would contribute to improving the quality of analysis being carried out and enhancing confidence levels in the use of the technology. There is much to be said for a pan-European "standard" for short courses on finite elements covering many aspects of FE modelling, e.g. linear, non-linear and dynamic analysis, fracture mechanics, fluid mechanics, heat transfer, etc. The feasibility of devising a written examination to assess students for a "Certificate of Competence" in the relevant FE area might also be considered.
There are a number of exploitation and dissemination issues that currently limit the effective uptake of analysis and simulation technology; two of the most important are QA and IPR. Quality Assurance in the broadest sense is about ensuring "fitness for purpose" of the analysis results, and embraces the still-maturing concepts of "verification" and "validation". There have been a number of attempts to formalise these concepts, involving rigorous definitions and procedures, in the belief that they can then be integrated with the quality procedures for other business processes. A difficulty is that non-practitioners tend to want validation (and verification) to be an absolute process with a "black and white" answer: a set of results (or even the model itself) is either valid or it is not, and no qualification is needed. Practitioners, on the other hand, tend to qualify their validation by highlighting all the uncertainties and assumptions that go into the simulation. Against this background, current QA philosophies for simulation sit somewhat uncomfortably alongside the more formalised procedures of other business processes.
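The practitioners' view of validation described above can be illustrated with a minimal sketch. Rather than collapsing a comparison between simulation and experiment into a single "valid / not valid" verdict, the check below reports a quantified discrepancy. The function names, data and tolerance are purely illustrative assumptions, not part of any formal V&V procedure.

```python
# Hypothetical sketch: validation as a quantified comparison rather than
# a binary pass/fail. All data and the 5% tolerance are illustrative.

def relative_errors(predicted, measured):
    """Relative error of each simulation prediction against a measurement."""
    return [abs(p - m) / abs(m) for p, m in zip(predicted, measured)]

def validation_report(predicted, measured, tolerance=0.05):
    """Report how many points fall within tolerance, and the worst case,
    instead of collapsing the result to a single yes/no verdict."""
    errors = relative_errors(predicted, measured)
    within = sum(1 for e in errors if e <= tolerance)
    return {
        "points_within_tolerance": within,
        "total_points": len(errors),
        "worst_relative_error": max(errors),
    }

# Illustrative data: simulated vs measured peak stresses (MPa)
report = validation_report([102.0, 98.5, 110.2], [100.0, 100.0, 100.0])
```

A report of this kind carries the qualification a practitioner wants (two of three points within 5%, worst error about 10%), while still giving the non-practitioner a concrete number to act on.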
A further issue, which is likely to become more important as simulation is integrated more closely with product development and other business processes, is how Intellectual Property Rights can be preserved. Conflicts can arise between the desire to standardise and the desire to maintain competitive advantage by asserting ownership of a process, data etc.
Again, a forum to debate the nuances of these issues is needed.
Summary of the Project Findings relating to Education & Dissemination
(as presented at the project review meeting in Malta, May 2005) (PDF Format)
D5603 - Procedural Benchmarks for Common Fabrication Details in Plate/Shell Structures
Jim Wood, University of Strathclyde (Hi-Res ZIP Version - 22MB)
WorldWide FEA Survey
Jim Wood, University of Strathclyde
This technology area is driven by the need (and increasingly the ability) to create holistic simulations which couple structural mechanics with fluid, thermal, acoustic, electrical and other descriptions of physical processes. Examples include aerodynamically induced noise and vibration effects in aircraft, metal casting processes, long-term ground movements due to thermally and lithostatically induced pore water movements, piezo-electric phenomena, and wave–structure interaction effects ranging from simple hydrodynamic loading on ships to fully integrated kinematic and structural vessel response simulation for stochastically defined sea states.
In this category we also include standards for the exchange of data and models between software; hardware and computer architecture advances; multi-processing; and the integration of simulation and CAE methods into the overall business process. In addition it covers improved (more robust) elements, meshless finite element analysis, and front-end modelling and post-processing, against the background of a continual demand for improved functionality and performance. New concepts such as stochastic and probabilistic methods also feature here as appropriate.
To represent the behaviour of the complex engineering processes mentioned above sufficiently comprehensively, two simulation capabilities are needed: the interaction of continuum phenomena at the macro-scale (multi-physics), and the representation of behaviour across a range of length and time scales simultaneously (multi-scale). Computational models of closely coupled multi-physics require numerical solution procedures with a measure of compatibility, so that the impact of one phenomenon (e.g. electromagnetic fields) can be represented in another (e.g. fluid flow) in a time- and space-accurate manner. Moreover, when multi-scale calculations are involved, a variety of domain decomposition techniques are required, which again demands a measure of compatibility amongst the solvers for the phenomena at each of the scales. Even when such multi-level calculations are a realistic aspiration for the analyst, their integration into optimisation tools, to facilitate right-first-time design for manufacture or performance, adds to the software engineering challenge of ensuring that the software components for the different aspects of the task are interoperable.
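One common way of achieving the solver compatibility described above is a partitioned (staggered) coupling scheme: each field is solved in turn, exchanging interface data, until the coupled state converges. The sketch below uses two single-value linear "solvers" purely for illustration; real multi-physics codes exchange whole fields across an interface, and the function names here are assumptions, not any particular code's API.

```python
# Hypothetical sketch of a partitioned (staggered) coupling loop between
# two single-degree-of-freedom "physics" solvers. The linear models are
# purely illustrative stand-ins for full field solvers.

def solve_field_a(b_value):
    """Stand-in for e.g. a fluid solve driven by the structural state."""
    return 1.0 + 0.3 * b_value

def solve_field_b(a_value):
    """Stand-in for e.g. a structural solve driven by the fluid load."""
    return 0.5 * a_value

def staggered_coupling(tol=1e-10, max_iter=100):
    """Fixed-point iteration: solve each field in turn, exchanging the
    interface value, until successive iterates agree to within tol."""
    a, b = 0.0, 0.0
    for iteration in range(1, max_iter + 1):
        a_new = solve_field_a(b)
        b_new = solve_field_b(a_new)
        if abs(a_new - a) < tol and abs(b_new - b) < tol:
            return a_new, b_new, iteration
        a, b = a_new, b_new
    raise RuntimeError("coupling iteration did not converge")

a, b, iters = staggered_coupling()
```

For this weakly coupled pair the fixed-point loop converges quickly; strongly coupled physics typically needs relaxation or a monolithic formulation, which is precisely where the compatibility demands on the solvers bite.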
Multi-physics and multi-scale calculations are very computationally intensive; within an optimisation loop they are even more so. The combined simulation and optimisation technology targeted at such applications will therefore have to exploit high-performance parallel computing systems. Significant effort will be needed over the next few years as these technologies become more accessible, penetrate the manufacturing industry sectors and become more common design tools.
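A minimal sketch of why optimisation loops map well onto parallel systems: the candidate designs in each generation are independent, so their (expensive) evaluations can run concurrently. A thread pool stands in here for an HPC system, and the objective function is a purely illustrative assumption, not a real simulation.

```python
# Hypothetical sketch: evaluating independent design candidates in
# parallel inside an optimisation loop. A thread pool stands in for the
# HPC systems discussed above; the objective is purely illustrative.
from concurrent.futures import ThreadPoolExecutor

def evaluate_design(thickness):
    """Illustrative objective: a mass-like term plus a stress-like
    penalty that grows as the section gets thinner."""
    return thickness + 1.0 / thickness

candidates = [0.5, 1.0, 1.5, 2.0, 2.5]

# Each evaluation is independent, so they can proceed concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    objectives = list(pool.map(evaluate_design, candidates))

best = min(zip(objectives, candidates))  # (best objective, its thickness)
```

In practice each `evaluate_design` call would be a full multi-physics solve dispatched to a cluster node, but the dispatch pattern is the same.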
Summary of the Project Findings relating to MultiPhysics & Analysis Technology
(as presented at the project review meeting in Malta, May 2005) (PDF Format)
This covers the need for simulation technology which allows the "system" to be optimised for a wide range of criteria and conditions. It includes, for example, improved methods of topology and weight optimisation and methods for treating uncertainties more rationally (e.g. reliability-based design optimisation), in addition to the detailed treatment of non-linear effects such as contact, friction and buckling. Other areas of interest include the large-strain effects encountered in modern forming and production processes, and impact modelling (including deformational response with large kinematics).
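The reliability-based treatment of uncertainty mentioned above rests on estimating a probability of failure rather than applying a fixed safety factor. A minimal Monte Carlo sketch of a single stress-versus-strength limit state is shown below; the normal distributions and their parameters are illustrative assumptions only.

```python
# Hypothetical sketch: Monte Carlo estimate of failure probability for a
# single stress-vs-strength limit state, of the kind used inside
# reliability-based design optimisation. Parameters are illustrative.
import random

def failure_probability(n_samples=100_000, seed=42):
    """Estimate P(stress > strength) with normally distributed
    applied stress and material strength."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        stress = rng.gauss(300.0, 30.0)    # applied stress, MPa
        strength = rng.gauss(400.0, 25.0)  # material strength, MPa
        if stress > strength:
            failures += 1
    return failures / n_samples

pf = failure_probability()
```

In a reliability-based design optimisation loop, an estimate of this kind (or a cheaper approximation such as FORM) becomes a constraint, e.g. requiring the failure probability to stay below a target value while weight is minimised.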
The quest of all engineering processes is to make things better. In the area of Computational Mechanics there have been huge advances in the last 30 years, with parallel developments in computers and computational algorithms. Finite Element Analysis has evolved to such a stage of maturity that the engineer or physicist can analyse almost any defined physical situation, linear or non-linear, provided the material properties are known.
In the last ten years there has been significant academic research in the area of Structural Optimization, to the stage where the algorithms needed for size, shape, topology and topography optimization are becoming more reliable and robust. We are now starting to see some limited commercial uptake of these analytical optimizers, replacing the traditional iterative design optimization methods driven by engineering intuition and heuristics.
The eventual goal of all structural optimization systems is to be able to deliver on a design wish list of structural goals such as:
None of the software products currently available delivers this whole list; none even addresses the last item in any realistic way. Currently there are two main computational techniques for structural optimization: mathematical programming with design variables (which can be the presence of an element, rather than a geometric entity), and heuristic methods. Several commercial FEA vendors offer one or both of these capabilities, and there are several in-house proprietary codes. There is still much research and development to be done, and much training of practising engineers, before Design and Structural Optimization becomes a routine part of the design process. The status at the moment is akin to that of FEA in the 1980s.
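The heuristic class of methods mentioned above can be illustrated with one of the oldest examples, a "fully stressed design" iteration: each member's cross-section is rescaled so that its stress approaches the allowable value. The loads, allowable stress and member data below are illustrative assumptions, not drawn from any of the codes discussed.

```python
# Hypothetical sketch of a heuristic size-optimisation loop: the classic
# "fully stressed design" update, which scales each member's area so its
# stress approaches the allowable value. All numbers are illustrative.

def fully_stressed_design(forces, allowable_stress, areas, n_iter=20):
    """Resize areas so that stress = force / area approaches the
    allowable stress. For statically determinate members the forces do
    not change with the areas, so the update converges immediately; the
    loop form mirrors its iterative use on indeterminate structures,
    where the member forces would be recomputed by an FE solve."""
    for _ in range(n_iter):
        stresses = [f / a for f, a in zip(forces, areas)]
        areas = [a * s / allowable_stress for a, s in zip(areas, stresses)]
    return areas

# Two members carrying 100 kN and 250 kN, allowable stress 200 N/mm^2:
areas = fully_stressed_design([100e3, 250e3], 200.0, [1000.0, 1000.0])
```

Mathematical programming approaches replace this intuitive resizing rule with an explicit objective, constraints and sensitivities, which is what makes them extensible to topology and topography variables.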
Equally important, but still significantly lacking, is the integration of manufacturing process models into the design optimization loop. Indeed, if we are to be commercially serious about the product under consideration, then financial, marketing, environmental, support-and-service and retirement considerations should also be included in the design optimization.
Each of these activities has different analysis processes and data structures, and responsibility for each resides in a different part of any commercial organization. Even between analysis and manufacturing models there are significant integration problems; the gap becomes even greater when other commercial processes are involved.
The challenge is therefore to guide the development and uptake of these new integrated analytical processes into true design optimization, and to provide direction to all parties involved (code developers, researchers, designers and manufacturers) as to how the Computational Mechanics community should proceed from here.
Summary of the Project Findings relating to Product & System Optimisation
(as presented at the project review meeting in Malta, May 2005) (PDF Format)
D3602 - The use of Design of Experiments (DOE) and Response Surface Analysis(RSA) in PSO (PDF, 1.6Mb)
Prof. Carlo Poloni, Dr. Valentino Pediroda, Dr. Alberto Clarich - University of Trieste, Prof. Grant Steven, University of Durham
D3608a - General Purpose FEA vs Single Purpose Design Optimisation (PDF, 6.5Mb)
Prof. Grant Steven, University of Durham
D3608b - Product and System Optimisation in Engineering Simulation (PDF, 2.3Mb)
Prof. Grant Steven, University of Durham
D3611 - The use of Robust Design and Game Theory in PSO (PDF, 2.3MB)
Prof. Carlo Poloni, Dr. Valentino Pediroda, Dr. Alberto Clarich - University of Trieste
D3614 - The use of Optimisation algorithms in PSO (PDF, 5.2Mb)
Prof. Carlo Poloni, Dr. Valentino Pediroda, Dr. Alberto Clarich - University of Trieste, Prof. Grant Steven, University of Durham