Introducing Kimeme, a Novel Platform for Multi-Disciplinary Multi-Objective Optimization

Giovanni Iacca and Ernesto Mininno

Cyber Dyne S.r.l.

Via Scipione Crisanzio 119, 70123 Bari, Italy
{giovanni.iacca,ernesto.mininno}@cyberdyne.it

Abstract. Optimization processes are an essential element in many practical applications, such as in engineering, chemistry, logistics, finance, etc. To fill the knowledge gap between practitioners and optimization experts, we developed Kimeme, a new flexible platform for multi-disciplinary optimization. A peculiar feature of Kimeme is that it can be used both for problem and algorithm design. It includes a rich graphical environment, a comprehensive set of post-processing tools, and an open-source library of state-of-the-art single and multi-objective optimization algorithms. In a memetic fashion, algorithms are decomposed into operators, so that users can easily create new optimization methods, just combining built-in operators or creating new ones. Similarly, the optimization process is described according to a data-flow logic, so that it can be seamlessly integrated with external software such as Computer-Aided Design & Engineering (CAD/CAE) packages, Matlab, spreadsheets, etc. Finally, Kimeme provides a native distributed computing framework, which allows parallel computations on clusters and heterogeneous LANs. Case studies from industry show that Kimeme can be effortlessly applied to industrial optimization problems, producing robust results also in comparison with other platforms on the market.

Keywords: Multi-disciplinary optimization, software design, graphical interface, distributed computing.

1 Introduction

Over the past two decades, Computational Intelligence Optimization (CIO) has become a popular topic among computer scientists and practitioners. The reasons for this success are manifold. First of all, industrial and societal problems have become (and are still becoming) ever more challenging, thus requiring robust solvers and algorithms. Despite the advancements in exact methods, e.g. those based on classical mathematical optimization, and related tools, such as CPLEX [1], which can efficiently solve many classes of optimization problems, there are still problems whose scale and complexity may hinder their use, such as large-scale problems, dynamic problems, problems affected by noise, or black-box problems for which a mathematical formulation is not even available [2, 3]. In all these cases, where strong guarantees are not really needed (or feasible) and best-effort “good” solutions are enough, heuristic methods (or “meta-heuristics”), such as those offered by CIO (Evolutionary Algorithms, Swarm Intelligence, etc.), are often the only effective option [4].

Another reason for the success of CIO is that these methods make very few assumptions (or none at all) about the problem at hand. These methods are, in fact, black-box, so that virtually any input/output system (i.e. a system where inputs, the problem design variables, are mapped to one or more outputs, the problem metrics to minimize or maximize, or “fitness” in the evolutionary jargon) can be optimized by using them. This property is especially useful, for example, in many engineering, networking, or logistics problems where an explicit, closed-form mapping between inputs and outputs is not available and the outputs are instead produced by a domain-specific simulator.
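Since CIO methods treat the problem as a black box, any callable mapping design variables to a fitness value can be optimized. As a minimal illustration (not taken from Kimeme), a greedy (1+1) random search needs nothing beyond the ability to call the objective; here `simulate` is a stand-in for a domain-specific simulator:

```python
import random

def simulate(x):
    # Stand-in for a black-box simulator: maps design variables to a
    # single fitness value to be minimized (a sphere function here).
    return sum(xi ** 2 for xi in x)

def random_search(fitness, dim, bounds, evaluations, seed=0):
    # Minimal black-box minimizer: it only calls fitness(x) and makes
    # no assumption about its (possibly non-analytic) internal form.
    rng = random.Random(seed)
    lo, hi = bounds
    best = [rng.uniform(lo, hi) for _ in range(dim)]
    best_f = fitness(best)
    for _ in range(evaluations - 1):
        # Gaussian perturbation of the incumbent, clamped to the bounds.
        cand = [min(hi, max(lo, xi + rng.gauss(0.0, 0.1 * (hi - lo))))
                for xi in best]
        f = fitness(cand)
        if f < best_f:  # greedy (1+1) acceptance
            best, best_f = cand, f
    return best, best_f

x, fx = random_search(simulate, dim=5, bounds=(-5.0, 5.0), evaluations=2000)
```

Replacing `simulate` with a call into a CAE tool or any other simulator changes nothing in the optimizer, which is exactly the black-box property exploited by CIO methods.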

Other reasons for the success of CIO are the fact that meta-heuristics can usually be applied to various problems with minimal coding/engineering effort and, finally, the fact that they are widely available in the literature. A plethora of methods exists nowadays, with different properties of robustness and self-adaptation; still, the family of these algorithms keeps growing, and every year the state-of-the-art in optimization is pushed forward.

Table 1. Some of the most popular optimization software tools. The column “Customizable” indicates whether the software allows implementing new optimization algorithms or customizing existing ones.

Software           GUI  MOO  Open-Source  Customizable  License
HEEDS MDO [5]      Yes  Yes  No           No            Commercial
HyperStudy [6]     Yes  Yes  No           Yes           Commercial
Isight [7]         Yes  Yes  No           No            Commercial
LIONsolver [8]     Yes  Yes  No           No            Commercial
modeFRONTIER [9]   Yes  Yes  No           No            Commercial
MOPS [10]          Yes  Yes  No           No            Commercial
Nexus [11]         Yes  Yes  No           No            Commercial
OpenMDAO [12]      No   Yes  Yes          Yes           Free
Optimus [13]       Yes  Yes  No           No            Commercial
OptiY [14]         Yes  Yes  No           No            Commercial
SmartDO [15]       Yes  Yes  No           No            Commercial
Xtreme [16]        Yes  Yes  No           No            Commercial
µGP [17]           No   Yes  Yes          Yes           Free
Kimeme [18]        Yes  Yes  No           Yes           Commercial

As a consequence of this trend, several software packages have been developed in recent years which provide off-the-shelf algorithms for solving multi-disciplinary optimization problems. Among these, tools designed for solving multi-objective optimization problems, i.e. problems where multiple conflicting criteria have to be optimized [19], have gained increasing popularity. This class of problems indeed arises in several domains, such as engineering [20, 21] or finance [22], and is therefore extremely relevant in practice.

A short list of such tools is reported in Table 1, where GUI and MOO indicate, respectively, whether the software has a Graphical User Interface and whether it allows for multi-objective optimization1. We also indicate whether each tool is open-source and whether it allows for customization of the optimization algorithms. We should note that the table includes only multi-disciplinary optimization tools that are specifically based on CIO methods; we excluded software based on classic techniques for convex optimization and integer linear/non-linear programming, as well as methods addressing combinatorial optimization only. We also excluded technical software products that are not devoted specifically to optimization but may still include optimization methods, such as Matlab (which provides the Optimization Toolbox) and other CAD/CAE or multi-physics software, as well as multi-disciplinary tools that provide (as an extra feature) one or more, often domain-specific, optimization techniques (see for instance AVL CAMEO [23]).

Clearly, different commercial and open-source platforms are characterized by different features in terms of usability, openness, modularity, scalability, ease of interfacing with external packages, etc. Nevertheless, many of these products share similar principles and patterns. For instance, among commercial software, many tools allow the user to describe the optimization problem graphically (with a data-flow or process-flow approach); typically, they provide a variably rich toolbox for post-processing, and a relatively small set of off-the-shelf optimization algorithms (closed-source, either legacy or from the literature) with limited possibility for parameter tuning or other algorithmic modifications. Open-source tools, instead, generally lack complete graphical interfaces or advanced post-processing features, and are thus more difficult to use, at least for practitioners interested in ready-to-use tools; they also generally require some knowledge of optimization and programming, for instance for writing scripts or markup files to interface with third-party software. However, due to their openness, these platforms allow a higher level of flexibility (e.g. expert users can develop their own algorithms or modify existing ones relatively easily). From this short summary we can note that, as a general trend, commercial and open-source optimization tools are rather far apart in terms of usability and flexibility (a distinction that, arguably, affects all technical software): on one hand, commercial tools prefer usability over flexibility, as they typically address the needs of users who are not necessarily specialized in programming and optimization (such as mechanical engineers, designers, logistics experts, and other corporate technical figures for whom optimization is simply a tool).
On the other hand, open-source software usually targets academic users, such as mathematicians and optimization scholars who are, in general, more interested in designing novel algorithms and testing them on benchmark functions than in applying existing algorithms to specific real-world problems. At present, it is hard to find tools that bridge the gap between these two worlds, providing at the same time the robustness and ease of use of proprietary software and the flexibility of open-source platforms.

1 A more complete list is available at:

Motivated by this idea, we developed a novel commercial multi-disciplinary optimization platform, Kimeme. Similar to other commercial platforms, Kimeme provides, off-the-shelf, a rich graphical interface, including the possibility of describing the optimization problem graphically, a large set of post-processing tools, such as plots and statistical analyses, and several CIO methods for both single and multi-objective optimization. Moreover, it includes a native distributed computing system, which allows for a seamless parallelization of the solution evaluations on a local network. However, differently from most commercial software, Kimeme adds the possibility of modifying the code of the existing algorithms, reusing it to design novel algorithms, or even integrating algorithms designed from scratch. In this way, Kimeme tries to bridge the gap between academia and industry, so that ever more powerful algorithms proposed in the specialized optimization literature can be easily implemented and made immediately available to practitioners willing to apply them to complex real-world problems.

In this paper we present the main features of Kimeme, its architecture, and the way algorithms are implemented. First, we describe in Sections 2.1 and 2.2, respectively, the problem and algorithm design environments, while in Section 2.3 we briefly introduce some of the post-processing tools available in the platform. Section 2.4 presents the distributed computing framework integrated in Kimeme, which leverages multiple computers to solve computationally expensive problems. Section 3 presents two case studies to exemplify some possible applications of Kimeme. Finally, in Section 4 we draw the conclusions of this work.

2 Architecture of Kimeme

The main feature of Kimeme is a rich GUI that assists the user in all the steps of the optimization process, from the problem definition to the post-processing of the results. In this section, we describe the main components of the Kimeme GUI, namely the problem design, the algorithm design and the post-processing tools. We also introduce another important feature of the platform, namely its native distributed computing infrastructure.

2.1 Problem Design

In Kimeme, every problem is internally represented as an evaluation tree, i.e. an execution tree where each node concurs (in a data-flow logic) to the evaluation of a single solution to the problem at hand. As shown in Figure 1.a, the tree can be created visually, by simply adding nodes from a palette. The palette currently includes several nodes for defining design variables, constraints, objectives and constants, as well as interface nodes to external software, such as Matlab, Python or Java code, Bash or DOS scripts, etc.



Fig. 1. Some screenshots of Kimeme: (a) project design view; (b) solution table view, including a 2D scatter plot (see also Section 2.3).

2.2 Algorithm Design

A key element of Kimeme is the possibility, for the user, to modify the existing algorithms provided off-the-shelf by the platform, or to implement new ones. A set of state-of-the-art optimization algorithms is available, both for single-objective problems (such as Differential Evolution (DE) [24], Evolution Strategies (ES) [25], Self-adapting Differential Evolution (jDE) [26], and the Nelder-Mead Simplex [27]) and multi-objective problems, including the Non-dominated Sorting Genetic Algorithm 2 (NSGA-2) [28], Multi-objective Particle Swarm Optimization (MOPSO) [29], the Strength Pareto Evolutionary Algorithm 2 (SPEA2) [30], Archived Multi-objective Simulated Annealing (AMOSA) [31], and two custom variants of Multi-objective DE (MODE) and ES (MOES).

Kimeme was designed with code re-usability in mind: algorithms are, in fact, fully open-source and structured in such a way that every single component of an algorithm can be reused in another algorithm. Inspired by the modern wave of Memetic Computing [32–36], algorithms are structured in self-contained elements called operators (in the memetic jargon, a generalization of a “meme”), each one performing simple operations on the problem solutions. As shown in Figure 2, the generic algorithm structure in Kimeme consists of a Design of Experiments (DoE), which generates the initial candidate solutions for the problem at hand (i.e., the initial “population”), followed by a sequence of Start Operators, Step Operators and Stop Operators. The Start Operators are executed only once, at the beginning of the optimization, to perform the various initializations needed for the execution of the algorithm. The Step Operators represent the core of the algorithm iterations and are executed repeatedly, until one or more stop conditions are met. Finally, the Stop Operators implement the various stop conditions and check, at the end of each iteration, whether any of those conditions is met.
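The control flow just described can be sketched schematically. The following Python skeleton of the DoE/Start/Step/Stop loop is purely illustrative; all function names and the toy operators are assumptions, not the actual Kimeme API:

```python
import random

def run_algorithm(doe, start_ops, step_ops, stop_ops):
    # Schematic of the generic Kimeme algorithm flow: a DoE seeds the
    # population, Start operators run once, Step operators repeat until
    # any Stop operator signals termination.
    ssl = doe()                      # initial Solution Set List
    for op in start_ops:             # one-shot initializations
        ssl = op(ssl)
    while not any(stop(ssl) for stop in stop_ops):
        for op in step_ops:          # core of each iteration
            ssl = op(ssl)
    return ssl

# Toy instantiation: 10 one-dimensional solutions, a Gaussian-mutation
# Step operator, and an iteration-count Stop operator.
rng = random.Random(1)
state = {"iter": 0}

def doe():
    return [[rng.uniform(-1.0, 1.0)] for _ in range(10)]

def mutate(ssl):                     # toy Step operator
    return [[x[0] + rng.gauss(0.0, 0.1)] for x in ssl]

def count_iters(ssl):                # toy Stop operator: 50 iterations
    state["iter"] += 1
    return state["iter"] > 50

final = run_algorithm(doe, [], [mutate], [count_iters])
```

Real operators would of course carry selection pressure and bookkeeping; the point here is only the ordering of the four operator classes.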

The way information is passed among operators is by means of a Solution Set List (SSL), i.e. a structured list of candidate solutions (each solution being in turn a structure containing lists of design variables, constraint and fitness values, as well as extra properties of the solution), as depicted in Figure 3.

Fig. 2. Generic algorithm flow-chart in Kimeme.

Fig. 3. Structure of the Solution Set List.

In this framework, each operator accepts as input a Solution Set List (generated by another operator) and returns as output a new Solution Set List, modified according to its internal operator logic. This structure has two main advantages. On one hand, the use of a structured list of solutions allows an immediate implementation of various modern schemes where multiple populations (composing a “meta-population”) co-evolve in parallel, for instance according to an island model, and eventually exchange solutions through some sort of migration mechanism [37]. On the other hand, this structure is especially useful in multi-objective optimization, where it can be necessary to rank the solutions according to their non-domination level, thus maintaining different solution sets, one for each rank.
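A minimal sketch of how such a nested structure might look in Python follows; the field names are illustrative assumptions, not the actual Kimeme data model:

```python
from dataclasses import dataclass, field

@dataclass
class Solution:
    # One candidate: design variables, constraint violations, fitness
    # values (one per objective), plus free-form extra properties.
    variables: list
    constraints: list = field(default_factory=list)
    fitness: list = field(default_factory=list)
    properties: dict = field(default_factory=dict)

# A Solution Set List is a list of solution sets; the multiple sets can
# represent sub-populations (island models) or non-domination ranks.
ssl = [
    [Solution([0.1, 2.0], fitness=[1.5, 0.3])],   # set 0 (rank/island 0)
    [Solution([0.4, 1.1], fitness=[2.0, 0.1])],   # set 1 (rank/island 1)
]
```

Because operators consume and produce whole SSLs, adding a new set (a new island or rank) requires no change to the operators that merely traverse the list.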

Kimeme comes with a rich library containing several operators for DoE, crossover, mutation, selection, constraint handling, solution niching and ranking, various utility functions for manipulating the Solution Set List (e.g. merging, cutting, splitting, copying, etc.), different stop conditions, and a number of algorithm-specific operations. Each operator defines one or more specific parameters and additional properties that are needed by the algorithm, and that can also be passed from one operator to another. A special operator, called Evaluation Operator, performs the actual evaluation of the incoming SSL, executing multiple instances of the evaluation tree (one for each solution in the SSL) defined in the problem design. This operator is also responsible for distributing the computation either locally, on multiple threads, or remotely, through the Kimeme Network (see Section 2.4). Apart from the Evaluation Operator, all operators available in Kimeme are open-source and can be dissected, modified and adapted to the users’ needs. Operators can be easily implemented in an object-oriented fashion, either in Python or Java (through an Integrated Development Environment within the Kimeme GUI), and interpreted or compiled dynamically at runtime. A complete Java/Python Application Programming Interface (API) is provided to help users program operators. The structure of an algorithm (i.e. the sequence of its operators, as well as their parameters and properties) is simply defined by an XML file, which can be edited either manually or using the Kimeme GUI. The latter assists the user in various operations such as changing the order of the operators by drag-and-drop, removing or adding operators, defining their parameters and properties, etc. A checker is also provided to automatically verify that there are no errors in the operators’ code or inconsistencies in the logic of the algorithm.
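To make the SSL-in/SSL-out contract concrete, here is a hypothetical Step operator in Python. The class name, the `apply` method and the parameter handling are assumptions for illustration, not the real Kimeme operator API; solutions are simplified to bare lists of design variables:

```python
import random

class GaussianMutation:
    # Hypothetical Step operator in the SSL-in/SSL-out style described
    # above: it takes a Solution Set List and returns a new, modified one.
    def __init__(self, sigma=0.1, seed=None):
        self.sigma = sigma                     # operator parameter
        self.rng = random.Random(seed)

    def apply(self, ssl):
        # Return a new SSL with every design variable perturbed by
        # Gaussian noise; the input SSL is left untouched.
        return [[[x + self.rng.gauss(0.0, self.sigma) for x in sol]
                 for sol in solution_set]
                for solution_set in ssl]

op = GaussianMutation(sigma=0.05, seed=42)
ssl_in = [[[0.0, 1.0], [2.0, 3.0]]]            # one set, two solutions
ssl_out = op.apply(ssl_in)
```

Returning a fresh SSL rather than mutating in place is one way to keep operators freely composable, since any operator's output can be fed to any other.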

2.3 Post-Processing

To analyze the results of the optimization process, Kimeme provides a rich set of plots, statistical analyses and post-processing tools. Some examples of such tools are shown in Figure 4, including various 2D and 3D scatter plots and other multi-dimensional visualization plots, such as matrix and parallel plots. The typical post-processing use case involves the selection of one or more solutions from the main solution tables (for example those belonging to the Pareto front) and the choice of the desired plots; see Figure 1.b for an example. In general, multiple plots can be associated with a single table, providing different levels of information about the solutions generated by the optimization process (for instance their distribution in the search space, the correlation among variables, etc.). The user can easily interact with the plots, for instance zooming in/out, rotating the 3D views, selecting a solution and visualizing its details, etc. The graphical details of each plot (line width, colors, markers, etc.) can also be edited independently. Additionally, solution data and plots can be exported to ASCII, Excel and Matlab files, and to various image formats.

2.4 Kimeme Network

We conclude this section with a brief description of the distributed computing feature available in Kimeme, called Kimeme Network. Distributed computing is particularly relevant in expensive optimization, for instance in many engineering design problems where each solution evaluation corresponds to a computationally heavy multi-physics simulation. In such scenarios, running an optimization algorithm (which typically requires hundreds of thousands of evaluations) on a single computer might introduce a bottleneck in the business processes. Distributed computing, which exploits the inherently parallel nature of most meta-heuristics, is therefore needed to cut design cost and time.



Fig. 4. Some of the post-processing plots available in Kimeme: (a) 3D bubble plot; (b) generation plot; (c) matrix plot; (d) box plot; (e) parallel plot; (f) PCA plot; (g) 2D scatter plot; (h) 2D scatter plot with probability density function; (i) 3D scatter plot.

The main architecture of the Kimeme Network is shown in Figure 5: a central Java daemon, called Dispatcher, receives requests for computation (i.e., individual solution evaluations) from one or more optimization processes instantiated by (different instances of) the Kimeme GUI. The Dispatcher then distributes the computations, according to various scheduling rules, over a list of available computing nodes (which can be modified at runtime), each running another Java daemon called Worker. Each Worker declares how many CPUs it has available and how many threads it can launch. Both daemons are platform-independent, so that it is possible to set up a heterogeneous distributed computing network infrastructure, for instance including Windows/Linux desktops or laptops next to a cluster and a dedicated high-performance machine.

Fig. 5. Kimeme Network architecture.
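The Dispatcher/Worker pattern can be approximated locally with a thread pool. The sketch below illustrates only the scheduling idea, farming out independent solution evaluations, and is not the actual Kimeme Network protocol; `evaluate` is an assumed stand-in for a remote Worker job:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def evaluate(solution):
    # Stand-in for an expensive evaluation (the real workload would be
    # a multi-physics simulation running on a remote Worker node).
    time.sleep(0.01)
    return sum(x * x for x in solution)

def dispatch(solutions, max_workers=4):
    # Minimal local analogue of the Dispatcher: farm independent
    # evaluations out to a pool of workers, preserving solution order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(evaluate, solutions))

fitness = dispatch([[1.0, 2.0], [3.0], [0.0, 0.0]])
```

Because the evaluations are mutually independent, the speed-up scales with the number of workers up to the population size, which is exactly why population-based meta-heuristics parallelize so naturally.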

3 Case Studies

We describe here two case studies that exemplify the applicability of Kimeme to engineering and industrial problems. The first case study is a classic structural engineering design problem, shown here for illustration purposes only. The second is a complex multi-objective metallurgical problem, related to the optimization of the productivity and CO2 emission of a blast furnace.

3.1 Optimal Design of a Cantilever

This problem consists of designing an aluminum cantilever of fixed length (l) with a fixed force (F) applied to the free edge, as shown in Figure 6.a. The goal of the design is to keep the structure lightweight and rigid or, formally speaking, to minimize both the mass (m) and the deflection (w). The decision variables are the outer (a) and the inner (b) edge lengths. The design has to comply with some dimensional and functional constraints, related to the feasibility of the structure (the outer edge length must be greater than the inner one) and to the deflection-length ratio, which must be less than 10%. A complete formulation of the problem is available in [38].
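For illustration, the two objectives can be computed with standard Euler-Bernoulli beam formulas for a hollow square cross-section. The numeric values of l, F, the density and the Young's modulus below are assumptions chosen for the example, and the exact formulation (and constants) may differ from [38]:

```python
def cantilever(a, b, l=0.5, F=1000.0, rho=2700.0, E=7.0e10):
    # Objectives for a hollow-square aluminum cantilever of outer edge a
    # and inner edge b (meters). l, F, rho, E are illustrative values.
    assert a > b >= 0.0                          # outer edge > inner edge
    mass = rho * l * (a * a - b * b)             # objective 1: mass [kg]
    I = (a ** 4 - b ** 4) / 12.0                 # second moment of area
    deflection = F * l ** 3 / (3.0 * E * I)      # objective 2: tip deflection [m]
    feasible = deflection / l < 0.10             # deflection-length ratio < 10%
    return mass, deflection, feasible

m, w, ok = cantilever(a=0.05, b=0.04)
```

Plugged into a multi-objective optimizer, `mass` and `deflection` are the two conflicting objectives that trace out the Pareto front of Figure 6.b.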

Figure 6.b shows the Pareto front generated by the MODE algorithm available in Kimeme. We can observe a good spread of solutions over the Pareto front, which is covered entirely and quite evenly.



Fig. 6. Cantilever design problem: (a) problem description and (b) solutions obtained in Kimeme using MODE with the standard parameter setting (population size 100, 250 generations): initial generation (red), last generation (green), Pareto front at the last generation (black dots).

3.2 Blast Furnace Productivity/CO2 Emission Optimization

This problem consists of minimizing the CO2 emission of an industrial iron blast furnace while simultaneously maximizing its productivity, with a constraint on the Silicon content of the hot metal produced by the furnace. This case study was developed at the Department of Metallurgical and Materials Engineering of the Indian Institute of Technology, and its results were originally published in a recent paper by Jha et al. [39]. In that work, the authors performed a systematic comparison between various algorithms available in Kimeme and in modeFRONTIER [9]. They also compared these algorithms with a custom-developed surrogate-assisted algorithm called Evolutionary Neural Network (EvoNN) [40]. In the study, different levels of Silicon were considered. Here, we report the preliminary results of a smaller subset of algorithms (without tuning, unpublished data) obtained on only two configurations (low and medium, with 0.40-0.55% and 0.55-0.70% Si, respectively). We refer the interested reader to the work by Jha et al. [39] for further details about the problem formulation and for the complete optimization results with tuned algorithms.

Figure 7 shows the Pareto fronts found by five algorithms from Kimeme (NSGA-2, MOPSO, MODE, MOES and SPEA2), one from modeFRONTIER (NSGA-2), and EvoNN, at the low and medium Silicon levels. It should be noted that, among all the methods shown in the figure, only EvoNN is surrogate-assisted; a fair direct comparison between this method and the others is therefore not possible. It can be observed (as also reported in [39]) that all the methods implemented in Kimeme guarantee a rather good solution spread over the Pareto front. Compared with the implementation of NSGA-2 in modeFRONTIER, all the algorithms in Kimeme perform quite well, especially MOPSO and MODE, which consistently show good results at all Silicon levels.



Fig. 7. Pareto fronts obtained on the furnace optimization problem described in [39]: (a) low and (b) medium Silicon level. mF NSGA2 indicates the NSGA-2 implementation in modeFRONTIER, while kim NSGA2, kim PSO, kim DE, kim ES, kim SPEA2 indicate, respectively, the open implementations of NSGA-2, MOPSO, MODE, MOES and SPEA2 in Kimeme. Lastly, EvoNN indicates the surrogate-assisted method proposed in [40]. Courtesy of Nirupam Chakraborti.

4 Conclusions

In this paper we have introduced Kimeme, an innovative platform for designing novel optimization algorithms and applying them to real-world problems. The core idea of Kimeme is to bridge the gap between computer scientists and practitioners, so as to foster a mutually beneficial transfer of knowledge between academia and industry. The platform offers a configurable environment for both designing algorithms and solving optimization problems, which can be effortlessly tailored to specific applications from various domains. In addition, Kimeme provides a very flexible, scalable and easy-to-use distributed computing infrastructure, which can be used to speed up the optimization process. Moreover, the set of state-of-the-art algorithms provided in Kimeme is broad enough to obtain good results on optimization problems from different domains: the case studies presented here showed that Kimeme can be easily applied to problems from mechanical design and metallurgy, but the platform is naturally applicable to other domains. Further developments are planned in the years to come to enrich the platform with new features, optimization algorithms and post-processing capabilities.

References

1. IBM: CPLEX. http://www-01.ibm.com/software/commerce/optimization/cplex-optimizer/
2. Tenne, Y., Goh, C.K.: Computational Intelligence in Optimization. Springer (2010)
3. Koziel, S., Yang, X.S.: Computational Optimization, Methods and Algorithms. Springer (2011)


4. Zelinka, I., Snasel, V., Abraham, A.: Handbook of optimization: from classical to modern approach. Volume 38. Springer (2012)

5. Red Cedar Technology: HEEDS MDO. http://www.redcedartech.com
6. Altair: HyperStudy. http://www.altairhyperworks.com
7. Dassault Systèmes: Isight. http://www.3ds.com
8. LIONlab: LIONsolver. http://lionoso.com/
9. ESTECO: modeFRONTIER. http://www.esteco.com/modefrontier
10. German Aerospace Center, Institute of System Dynamics and Control, Aircraft Systems Dynamics: MOPS. http://www.dlr.de/rm/en/desktopdefault.aspx/tabid-3842/6343_read-9099/

11. iChrome: Nexus. http://ichrome.com/solutions/nexus
12. NASA Glenn Research Center: OpenMDAO. http://openmdao.org/
13. Wilde Analysis Ltd.: Optimus. http://wildeanalysis.co.uk/fea/software/optimus
14. OptiY GmbH: OptiY. http://www.optiy.eu/
15. FEA-Opt Technology: SmartDO. http://www.smartdo.co/
16. Optimal Computing: Xtreme. http://www.optimalcomputing.be/

17. Sanchez, E., Schillaci, M., Squillero, G.: Evolutionary Optimization: the µGP toolkit. 1st edn. Springer Publishing Company, Incorporated (2011)

18. Cyber Dyne Srl: Kimeme. http://cyberdynesoft.it/

19. Deb, K.: Multi-objective optimization. In: Search methodologies. Springer (2014) 403–449

20. Köppen, M., Schaefer, G., Abraham, A.: Intelligent computational optimization in engineering. Springer (2011) 300–331

21. Yang, X.S., Koziel, S.: Computational optimization and applications in engineering and industry. Volume 359. Springer (2011)

22. Chen, S.H., Wang, P.P.: Computational intelligence in economics and finance. Springer (2004)

23. AVL: AVL CAMEO. https://www.avl.com/cameo

24. Storn, R., Price, K.: Differential Evolution - a Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces. Journal of Global Optimization 11(TR-95-012) (1997) 341–359

25. Beyer, H.G.: The Theory of Evolution Strategies. Springer-Verlag, Berlin Heidelberg (2001)

26. Brest, J., Greiner, S., Bošković, B., Mernik, M., Žumer, V.: Self-Adapting Control Parameters in Differential Evolution: A Comparative Study on Numerical Benchmark Problems. IEEE Transactions on Evolutionary Computation 10(6) (2006) 646–657

27. Nelder, J.A., Mead, R.: A simplex method for function minimization. The Computer Journal 7 (1965) 308–313

28. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2) (2002) 182–197

29. Coello Coello, C.A., Lechuga, M.: MOPSO: a proposal for multiple objective particle swarm optimization. In: Proceedings of the 2002 Congress on Evolutionary Computation (CEC '02). Volume 2. (2002) 1051–1056

30. Zitzler, E., Laumanns, M., Thiele, L.: SPEA2: Improving the strength Pareto evolutionary algorithm (2001)

31. Bandyopadhyay, S., Saha, S., Maulik, U., Deb, K.: A simulated annealing-based multiobjective optimization algorithm: AMOSA. IEEE Transactions on Evolutionary Computation 12(3) (2008) 269–283


32. Iacca, G., Neri, F., Mininno, E., Ong, Y.S., Lim, M.H.: Ockham’s Razor in Memetic Computing: Three Stage Optimal Memetic Exploration. Information Sciences 188 (2012) 17–43

33. Caraffini, F., Neri, F., Iacca, G., Mol, A.: Parallel memetic structures. Information Sciences 227 (2013) 60–82

34. Iacca, G., Caraffini, F., Neri, F.: Memory-saving memetic computing for path-following mobile robots. Applied Soft Computing 13(4) (2013) 2003–2016
35. Neri, F., Cotta, C., Moscato, P.: Handbook of Memetic Algorithms. Volume 379 of Studies in Computational Intelligence. Springer (2011)

36. Caraffini, F., Iacca, G., Neri, F., Mininno, E.: The importance of being structured: a comparative study on multi stage memetic approaches. In: Computational Intelligence (UKCI), 2012 12th UK Workshop on, IEEE (2012) 1–8

37. Mühlenbein, H.: Parallel genetic algorithms, population genetics and combinatorial optimization. In: Parallelism, Learning, Evolution. Springer (1991) 398–406
38. Cyber Dyne Srl: Kimeme Quick Guide

39. Jha, R., Sen, P.K., Chakraborti, N.: Multi-objective genetic algorithms and genetic programming models for minimizing input carbon rates in a blast furnace compared with a conventional analytic approach. Steel Research International 85(2) (2014) 219–232

40. Pettersson, F., Chakraborti, N., Saxén, H.: A genetic algorithms based multi-objective neural net applied to noisy blast furnace data. Applied Soft Computing 7(1) (2007) 387–397
