Overview
Experimental design (also called Design of Experiment (DOE)) is a collection of procedures and statistical
tools for planning experiments and analyzing the results. In general, the experiments measure the
performance of a physical prototype, the yield of a manufacturing process, or the quality of a finished
product.
Although experimental design techniques were originally developed for physical experiments, they also
work very well with virtual experiments. In the case of Adams/Insight, the experiments help increase the
reliability of your conclusions, get you answers faster than trial-and-error or testing factors one at a time,
and help you better understand and refine the performance of your mechanical system.
For simple design problems, you can explore and optimize the behavior of your mechanical system using
a combination of intuition, trial-and-error, and brute force. As the number of design options increase,
however, these methods become ineffective in formulating answers quickly and systematically. Varying
just one factor at a time does not give you information about the interactions between factors, and trying
many different factor combinations can require multiple simulations that leave you with a great deal of
output data to evaluate. To help remedy these time-consuming tasks, Adams/Insight provides you with
the planning and analysis tools for running a series of experiments. Adams/Insight also helps you to
determine relevant data to analyze, and automates the entire experimental design process.
Process
The experimental design process includes five basic steps:
• Determine the purpose of the experiment. For example, you might want to identify which
variations most affect your system.
• Choose a set of factors for the system that you are investigating and develop a way to measure
the appropriate system responses.
• Determine the values for each factor (called Levels), and plan a set of experiments (called runs or
trials) in which you vary the factor values from one trial to another. The combination of actual
runs to perform is called the design.
• Execute the runs, recording the performance of the system at each run.
• Analyze the changes in performance across the runs, and determine what factors most affect
your model.
An experiment configured using this process is called a designed experiment, or matrix experiment. The
runs are described by the design matrix, which has a column for each factor and a row for each run. The
matrix entries are the levels for each factor per run.
Experiments with two or three factors might only require five or ten runs. As the number of factors and
levels grows, however, the number of runs can quickly escalate to dozens, even hundreds. As a result, a
good design is critical to the success of the experiment. It should contain as few runs as possible, yet give
enough information to accurately depict the behavior of your system. The best design depends on the
number of factors and levels, the nature of the factors, assumptions about the behavior of the product or
process, and the overall purpose of the experiment. Adams/Insight lets you combine all of these
requirements into an efficient, effective design for your problem, and help you make accurate analyses
of the results.
Analysis
The type of analysis you’ll run depends on the purpose of the experiment. Common analyses include
Analysis of Variance (ANOVA), which determines the relative importance of the factors, and Linear
Regression, which fits an assumed mathematical model to the results.
Example
If a simple experiment includes two factors, each with three Levels and four runs, the design matrix for
the experiment might look like this:
0 1
–1 0
1 –1
1 1
Each row of the matrix represents a run, and each column represents a factor. A -1 indicates the first level
for the factor, a 0 the second, and a +1 the third.
If the levels for the first factor are 9, 10, and 11, and the levels for the second factor are 85, 90, and 95,
then the matrix would give the following runs:
10 95
9 90
11 85
11 95
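The coded-to-actual mapping can be sketched in a few lines of Python (illustrative only; the `decode` helper is not part of Adams/Insight):

```python
# Hypothetical sketch: translate the coded design matrix from the example
# above into actual factor settings. The level lists and matrix entries
# come directly from the text.
levels = [
    [9, 10, 11],    # factor 1: values for codes -1, 0, +1
    [85, 90, 95],   # factor 2: values for codes -1, 0, +1
]
design = [
    [0, 1],
    [-1, 0],
    [1, -1],
    [1, 1],
]

def decode(design, levels):
    """Map coded entries (-1, 0, +1) onto the corresponding levels."""
    # index 0 of each level list corresponds to code -1, so shift by 1
    return [[levels[j][code + 1] for j, code in enumerate(row)]
            for row in design]

runs = decode(design, levels)
# runs -> [[10, 95], [9, 90], [11, 85], [11, 95]]
```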
References
DOE
• Grove, D. M., and T. P. Davis. Engineering Quality Experimental Design. ISBN 0-582-06687-5.
• Cornell, John A. How to apply Response Surface Methodology. American Society for Quality
Control Statistics Division, Volume 8. ISBN 0-87389-066-3.
• Myers, Raymond H., and Douglas C. Montgomery. Response Surface Methodology. Wiley-
Interscience. ISBN 0-471-58100-3.
• Box, G. E. P., and D. W. Behnken. Some New Three Level Designs for the Study of Quantitative
Variables. Technometrics, Vol. 2, No. 4, November 1960.
• Gentle, James E., Random Number Generation and Monte Carlo Methods. Springer-Verlag,
1998.
• Introduction to Monte Carlo Methods. http://www.phy.ornl.gov/csep/CSEP/MC/MC.html.
• Numerical Recipes. http://www.nr.com.
• Greenwood, W.H. and Chase, K.W., A New Tolerance Analysis Method for Designers and
Manufacturers, ASME Journal of Engineering for Industry, Vol. 109, pp. 112-116, May 1987.
Python
• http://www.python.org/
• Beazley, David M., Python Essential Reference. ISBN 0-735-70901-7.
• Lutz, Mark. Python Pocket Reference. ISBN 0-596-00189-4.
• To run Python from the mdadams2010 environment, use the command 'mdadams2010 -c python'.
Regression/RSM
• Draper, Norman R., and Harry Smith. Applied Regression Analysis. John Wiley & Sons, 1998.
ISBN 0-471-17082-8.
• Box, George E. P., and Norman Richard Draper. Empirical Model-Building and Response
Surfaces. John Wiley & Sons, 1987. ISBN 0-471-81033-9.
• Montgomery, Douglas C., and Raymond H. Myers. Response Surface Methodology: Process and
Product in Optimization Using Designed Experiments. John Wiley & Sons, 1995. ISBN 0-471-
58100-3.
Statistics/Distributions
• NIST/SEMATECH e-Handbook of Statistical Methods,
http://www.itl.nist.gov/div898/handbook/index.htm
XML
• Eckstein, Robert. XML Pocket Reference. ISBN 1-56592-709-5.
• Harold, Elliotte Rusty, and W. Scott Means, XML in a Nutshell. ISBN 0-596-00058-8.
• http://www.w3schools.com/
Learning the Basics
Starting Adams/Insight
There are two ways to start Adams/Insight:
• Start Adams/Insight and open an existing experiment.
• Start Adams/View, run a simulation, and then export the results to Adams/Insight.
To start Adams/Insight:
1. Do one of the following:
• On UNIX, type the command to start the Adams Toolbar at the command prompt, and then
press Enter. Select the Adams/Insight tool.
• On Windows, from the Start menu, point to Programs, point to MSC.Software, point to MD
Adams 2010, point to AInsight, and then select Adams - Insight.
The Adams/Insight main window appears.
2. Open your existing experiment.
To start Adams/View:
• Do one of the following:
• On UNIX, type the command to start the Adams Toolbar at the command prompt, and then
press Enter. Select the Adams/View tool.
• On Windows, from the Start menu, point to Programs, point to MSC.Software, point to MD
Adams 2010, point to Aview, and then select Adams - View.
The Adams/View main window appears.
3. In the File to Read text box, enter the name of your command file, or right-click and select
Browse. You can then use the Select File dialog box to open the file.
4. Select OK.
Adams/View imports the file, and then displays the model.
Note: You can also display the Database Navigator by double-clicking in the Model or
Simulation Script text box.
5. Optionally, enter the name of an existing experiment in the Reuse Experiment text box.
You can also browse for the experiment in the Database Navigator. Right-click in the text box,
point to Experiment, and then select Browse. Select the model in the Database Navigator, and
then select OK.
If you enter an experiment to reuse, Adams/Insight reuses as many components of the old
experiment as possible in the new experiment. More about the Reuse Tool.
Note: The new experiment cannot have the same name as the old experiment. Adams/View will
rewrite the new experiment file, erasing any old information in the file. To reuse an old
experiment, use a different name for the new experiment.
6. Select OK.
Adams/View writes your model information to an experiment file and launches Adams/Insight.
The experiment file will have the same name as the experiment, and will be written in the current
Adams/View working directory.
If you reused an old experiment and Adams/Insight is able to use it to create a work space for the
new experiment, Adams/Insight immediately adds the work space to the new experiment file and
returns to Adams/View. Adams/View will then run the experiment.
If you did not reuse an old experiment, or if Adams/Insight is unable to automatically create a work space
using the old experiment, the Adams/Insight main window appears.
On Windows, Adams/View opens a command prompt window to launch Adams/Insight. This window
stays open until you exit Adams/Insight and return to Adams/View. Do not manually close the command
prompt window.
Setting Preferences
You can set preferences that will be used for all Adams/Insight experiments. You can define properties
for design, fit, optimization, and thresholds.
To set preferences:
1. From the Edit menu, select Preferences.
2. Complete the dialog box as described in Preferences.
3. Select OK.
Adams/Insight Main Window
• Menu Bar: The Menu bar contains pull-down menus for File, Edit, Define, Simulation, Tools,
and Help.
• Toolbars: The toolbars contain commonly-used tools for accessing files, creating experiments,
and generating reports. The tools in the toolbars are arranged in the order that you use them in
the process of creating and executing your designed experiment. Depending on where you are in
the process of creating an experiment, Adams/Insight enables or disables the tools. This feature
alerts you to the correct order of procedures to follow.
• Treeview: The treeview displays a hierarchical list of objects that you can include in an
experiment. The tree is especially useful in selecting and identifying objects when you are
creating a design matrix.
• Viewport: The Viewport is the area of the window that displays fields for modifying the objects
you select from the treeview.
• Status bar: The Status bar displays messages and issues prompts during your Adams/Insight
session.
About the Toolbars
Experiments Toolbar
The Experiments toolbar lets you execute basic commands. It includes the following tools:
Report Toolbar
The report toolbar lets you generate and export a report. It includes the following tools:
Setting Up Toolbars
You can turn the display of the toolbars on or off and set where they appear. By default, all toolbars are
displayed at the top of the window. To set the placement of these objects, see Moving Toolbars.
The toolbar selection menu lists the following entries: Experiments, Experiments Contents, Work
Space, Report, and Line up. If you select Line up from the toolbar selection menu, Adams/Insight takes
all toolbars and aligns them in the top left corner of the window.
Moving Toolbars
You can move the Adams/Insight toolbars to other areas of the screen.
To move a toolbar:
1. Put your cursor on the divider for the toolbar you want to move.
Analysis
You can view the properties of each model in your experiment using the Model Properties form. You can
view statistic categories on the following:
• Regression Summary
• Response Summary
Regression Summary
The regression summary displays a summary of statistics for the entire model. You can view the
following statistics for your model:
• Properties
• Rules Summary
• Goodness of Fit
• Term Significance
• Studentized Residuals
• Cook’s Statistics
• Term coefficients
• Beta (standardized coefficient)
• Residuals
• Estimates
• Minimum and maximum estimates
Response Summary
The response summary displays a summary of statistics for a specific response in your experiment. You
can view the following statistics for the response:
• Fit Table
• Term
• Residuals
• Condition
• Minimum and maximum
• Plot - Raw residuals vs. Estimates
• Plot - Responses vs. Trials
• Plot - Studentized residuals vs. Trials
• Histogram - Raw residuals
Designs
Design Specification
The Design Specification form is where you define the details of your experiment. Some details are:
• DOE Design Types
• Investigation Strategy
• Model
Full Factorial
Full Factorial is the most comprehensive of the design types and uses all of the possible combinations of
Levels for your factors. The total number of runs is m^n, where m is the number of levels and n is the
number of factors. Since the values for m^n increase very quickly, Full Factorial is only practical for an
experiment with few factors.
The Full Factorial algorithm can produce mixed-level designs that have a different number of values for
each factor. Mixed-level designs can occur when you have discrete variables, which take on values from
a fixed list. This contrasts with continuous variables, which take on arbitrary values that are usually
constrained to a range. For example, a mixed-level design might have two Design Variables, one with two
levels and one with three levels. The number of runs for such a design is 2 * 3 = 6. In general, to compute
the number of rows in a Full Factorial design, just multiply the number of levels of each design variable.
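This run-count rule is easy to verify with a short Python sketch (the variable names are illustrative, not Adams/Insight API):

```python
from itertools import product

# Sketch of a mixed-level Full Factorial enumeration: one two-level design
# variable and one three-level design variable, as in the example above.
dv1 = [-1, 1]        # two-level factor (coded)
dv2 = [-1, 0, 1]     # three-level factor (coded)

runs = list(product(dv1, dv2))   # every combination of levels

# the number of runs is the product of the level counts: 2 * 3 = 6
assert len(runs) == len(dv1) * len(dv2)
```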
Fractional Factorial
Fractional Factorial and Plackett-Burman designs are referred to as reduced factorial designs. They are
popular for screening important variables and are used principally with two-level factors. They enable
you to estimate the effects on your system and, depending on the number of factors and the number of
runs, estimate either none, some, or all of the two-factor interactions.
They are appropriate for two-level screening experiments when you are primarily interested in
identifying the most significant factors (main effects) affecting the responses under investigation. As a
subset of Full Factorial, these designs require fewer Trials, but may result in confounding of factor
interactions with main effects. You should use these designs with the Screening method of experimental
design, not RSM. Learn more about Screening and RSM.
These design types let you specify the number of trials under certain conditions. For example, for four
factors and a linear model, the only possible number of trials is 8. For five factors and a linear model, you
can have either 8 or 16 trials.
The number of runs for a Fractional Factorial design must be a power of two (4, 8, 16...).
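As an illustration of how a reduced factorial trades runs for confounding, the following Python sketch builds a half-fraction of a four-factor two-level design by generating the fourth column from the other three. This is a textbook 2^(4-1) construction, not necessarily the algorithm Adams/Insight uses:

```python
from itertools import product

# Illustrative half-fraction 2^(4-1) design: the first three factors form a
# full two-level factorial, and the fourth column is generated as
# D = A*B*C, giving 8 runs instead of the 16 a Full Factorial would need.
base = list(product([-1, 1], repeat=3))
design = [(a, b, c, a * b * c) for (a, b, c) in base]

assert len(design) == 8                                   # a power of two
assert all(abs(v) == 1 for row in design for v in row)    # two-level only
```

Because D is aliased with the A*B*C interaction, its main effect is confounded with that three-factor interaction, which is the trade-off the text describes.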
Plackett-Burman
Plackett-Burman designs are useful for screening a large number of factors to find the most important
ones. These designs require the fewest runs of any classical design type, but do not allow you to estimate
the interactions between factors.
The number of runs for a Plackett-Burman design must be a multiple of four (4, 8, 12, ..., 48).
Box-Behnken
Box-Behnken designs use points on planes of the design space as shown in the diagram below. A Box-
Behnken design requires relatively few trials. For example, a 12-factor design has 192 rows with 12
center points, for a total of 204 trials. Even though the number of trials is low, the results yield
information on factor interaction, which makes these designs appropriate for RSM experiments in which
the model type is quadratic. Box-Behnken designs require using each factor at three levels, and are
available for designs with 3, 4, 5, 6, 7, 9, 10, 11, 12, or 16 factors.
Box-Behnken Design with Three Factors and Three Levels:
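The point pattern named in the caption above can be sketched in Python as follows (an illustrative construction, not the Adams/Insight generator):

```python
from itertools import combinations, product

# Sketch of a three-factor Box-Behnken pattern: each pair of factors takes
# all +/-1 combinations while the remaining factor stays at its center
# level (0); center points are appended separately.
def box_behnken(n_factors, n_center=3):
    rows = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product([-1, 1], repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            rows.append(row)
    rows += [[0] * n_factors for _ in range(n_center)]
    return rows

design = box_behnken(3)
# 12 edge-midpoint runs plus 3 center points; every factor appears only
# at the three levels -1, 0, +1
assert len(design) == 15
```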
D-Optimal
The D-Optimal design produces a model that minimizes the uncertainty of coefficients. This design
consists of a random collection of rows from a larger pool of Candidates that are selected using
minimization criteria. D-Optimal designs let you specify the total number of runs in an experiment,
supply existing rows from a previous experiment into a new experiment, and specify a different level for
each factor. These features make D-Optimal designs the best choice in many situations, especially when
experiment cost is a significant consideration.
The D-Optimal criterion scales to design matrices of any size. The more redundant the vectors
(columns) of the design matrix, the closer to zero the determinant of the correlation matrix is for those
vectors; the more independent the columns, the larger the determinant of that matrix. Therefore,
finding a design that maximizes the determinant D of this matrix means finding a design in which the
factor effects are maximally independent of each other.
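The determinant criterion can be demonstrated with a small Python sketch that searches a pool of candidate points for the four-run design maximizing det(X'X) for a two-factor linear model. This is an exhaustive toy search, not the Adams/Insight algorithm:

```python
from itertools import combinations, product

# Illustrative D-Optimal selection for the model R = a1 + a2*F1 + a3*F2:
# from a 3x3 grid of candidate points, pick the 4 runs that maximize the
# determinant of the moment matrix X'X.

def det3(m):
    # determinant of a 3x3 matrix by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def d_criterion(rows):
    X = [(1, f1, f2) for f1, f2 in rows]      # intercept plus two factors
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    return det3(xtx)

candidates = list(product([-1, 0, 1], repeat=2))
best = max(combinations(candidates, 4), key=d_criterion)

# the winning design pushes its points to the corners of the region, where
# the factor columns are maximally independent
assert d_criterion(best) >= d_criterion(candidates[:4])
```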
Latin Hypercube
A Latin Hypercube design uses as many values as possible for each factor. Each factor's values are
randomly ordered so that each run has a random combination of factor values.
Continuous factors have a different value for each run. The values are equally spaced, running from the
minimum value to maximum factor value. Discrete factors have a fixed number of values. If there are
more runs than discrete values, there will be runs with duplicate factor values. If there are fewer runs than
discrete values, then not all values will be used.
The Latin Hypercube design is similar to a Sweep Study design, except that the factor values in each
column are randomly ordered instead of uniformly sweeping from the minimum value to maximum
value.
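The column-shuffling idea can be sketched in Python (illustrative only; Adams/Insight's own generator may differ in details such as the random source):

```python
import random

# Minimal Latin Hypercube sketch: each continuous factor gets equally
# spaced values from its minimum to its maximum, and each column is then
# shuffled independently so every run is a random combination.
def latin_hypercube(bounds, n_runs, seed=0):
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        step = (hi - lo) / (n_runs - 1)
        values = [lo + i * step for i in range(n_runs)]   # equally spaced
        rng.shuffle(values)                               # random order
        columns.append(values)
    return [list(run) for run in zip(*columns)]           # rows are runs

runs = latin_hypercube([(9.0, 11.0), (85.0, 95.0)], n_runs=5)
# every factor value appears exactly once in its column
```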
Investigation Strategies
The investigation strategies (methods) for creating a design matrix in Adams/Insight include:
• Study - Perimeter
• Study - Sweep
• DOE Screening (2 Level)
• DOE Response Surface
• Variation - Monte Carlo
• Variation - Latin Hypercube
The first four strategies in the list reference attributes specified in the Settings tab of the Factor form. The
two Variation methods reference attributes in the Variation tab of the Factor form.
Study - Perimeter
This method is used to evaluate the relative robustness of an analytical model, and is often called a
Process Health Check. The system under investigation is exercised at three different configurations:
• In the first trial, all the factors are set to their respective minimum values.
• In the second trial, the factors are set to their intermediate value.
• In the third trial, the factors are set to their respective maximum values.
When first investigating a system, it is good practice to determine the relative robustness of the nominal
simulation. The first step in this process is to make sure that the nominal configuration runs well. The
next step is to determine the likelihood that variants of the nominal configuration will run well. You can
use the perimeter study to run three different configurations that span the design space. The successful
running of these three configurations builds confidence that you are working with a robust simulation.
Before submitting a series of simulations that you expect to run overnight, it is important to run a
perimeter study to verify that the basic mechanics of building, running, and postprocessing the analytical
system perform as expected.
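The three perimeter-study configurations can be sketched as follows (the factor names and ranges are made up for illustration):

```python
# Hypothetical sketch of the three perimeter-study trials described above:
# each factor is defined by its (minimum, intermediate, maximum) values.
factors = {
    "length": (9.0, 10.0, 11.0),
    "stiffness": (85.0, 90.0, 95.0),
}

trials = [
    {name: lv[0] for name, lv in factors.items()},  # trial 1: all minimums
    {name: lv[1] for name, lv in factors.items()},  # trial 2: intermediates
    {name: lv[2] for name, lv in factors.items()},  # trial 3: all maximums
]
```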
If you choose the Perimeter Study as the Investigation Strategy, the model type is automatically set to
None, and there is no option for fitting the results or for subsequent publishing or optimization. This
Investigation Strategy is used to determine the relative robustness of the simulation and the simulation
process.
Note: If you select one factor for a Sweep or Perimeter Study, you can fit a model to the results.
If you have more than one factor, you cannot fit a model, so the Model option in the Design
Specifications form is set to None.
Study - Sweep
This method alters the respective inputs over a range. For example, let's say you wanted to alter the initial
velocity of a vehicle from 50 to 100 KPH. You would define the initial velocity as a factor with settings
50 and 100. If this is the only factor in the investigation, you could select the sweep study investigation
strategy in the Design Specifications form. The Number of Runs specifies how the factor interval will be
divided. If you specified six trials, then the simulation would be run at 50, 60, 70, 80, 90, and 100 KPH.
If only one factor is in the Factor Candidates List for a sweep study, you can potentially fit a regression
model. If more than one factor has been promoted, the Sweep Study permits only the None option for
the regression model type, and no subsequent model fitting or publishing of a fit model is available.
Sweep Studies are sometimes referred to as design studies.
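The interval division described above can be sketched in Python:

```python
# Sketch of how a sweep study divides a factor interval; the 50-100 KPH
# initial-velocity example from the text, with six trials.
def sweep(lo, hi, n_runs):
    step = (hi - lo) / (n_runs - 1)
    return [lo + i * step for i in range(n_runs)]

velocities = sweep(50.0, 100.0, 6)
# velocities -> [50.0, 60.0, 70.0, 80.0, 90.0, 100.0]
```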
Note: This investigation strategy creates a collection of points, which approximates the specified
distribution with fewer trials than the Variation - Monte Carlo method.
Model
In performing a regression analysis, the objective is to fit an equation (referred to as the model) to the
data such that the error between the values predicted by the equation and the actual observed values is
minimized.
The model can have a constant term, linear terms, quadratic terms, and cubic terms. For example, if there
are two factors, the forms are as shown below:
Type:          Form:
Linear         R = a1 + a2*F1 + a3*F2 + e
Interactions   R = a1 + a2*F1 + a3*F2 + a4*F1*F2 + e
Quadratic      R = a1 + a2*F1 + a3*F2 + a4*F1*F2 + a5*F1^2 + a6*F2^2 + e
Cubic          R = a1 + a2*F1 + a3*F2 + a4*F1*F2 + a5*F1^2 + a6*F2^2 + a7*F1*F2^2 + a8*F1^2*F2 + a9*F1^3 + a10*F2^3 + e
where R is the response, F1 and F2 are the factors, a1 through a10 are the coefficients computed by the
fit, and e is the error term.
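As a concrete illustration, the quadratic form can be evaluated in Python with made-up coefficient values (the error term e is taken as zero here):

```python
# Illustrative evaluation of the two-factor quadratic model form; the
# coefficient values a1..a6 are invented for this example.
def quadratic(F1, F2, a):
    a1, a2, a3, a4, a5, a6 = a
    return (a1 + a2 * F1 + a3 * F2 + a4 * F1 * F2
            + a5 * F1 ** 2 + a6 * F2 ** 2)

r = quadratic(2.0, 3.0, (1.0, 0.5, 0.25, 0.1, 0.01, 0.02))
# r evaluates to approximately 3.57
```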
Design Space
The design space is a matrix with the rows representing the run and the columns representing the factor
settings. The settings are in a normalized representation. Learn more about the Design Space form.
Inclusions
Adams/Insight enables you to import a full or partial design matrix whose factor settings will be included
when the complete workspace is generated. This is only applicable for D-Optimal design types.
Factors
A factor is a variable that you want to vary in your experiment. Factors are the inputs to a system, such
as a geometric point location, a user-defined element parameter, or a Design Variable. You define your
factors using the Factor Form.
Tying Factors
• Overview
• Procedure
Overview
When you create a tie, you specify the type of the tie, and the corresponding scale or offset value for each
tied factor. Adams/Insight then computes the value of the tied factors from the value of the tie.
If the tie type is Scale, Adams/Insight uses the following formula:
current factor value = current tie value * component factor scale value
where:
• Current factor value is the value the factor becomes during the current trial.
• Current tie value is the value the tie assumes for the current trial.
• Component factor scale value is the value entered in the Tied Factors table in the tie Factor Form.
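The Scale formula can be sketched in Python; the factor names and the 200.0 tie value below echo the symmetric-factors example later in this section, but the helper itself is illustrative, not Adams/Insight API:

```python
# Sketch of the Scale tie formula from the text:
#   current factor value = current tie value * component factor scale value
def tied_values(tie_value, scales):
    return {name: tie_value * scale for name, scale in scales.items()}

# a symmetric pair: two factors tied with scales +1 and -1
values = tied_values(200.0, {"factor_01": 1.0, "factor_02": -1.0})
# values -> {"factor_01": 200.0, "factor_02": -200.0}
```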
4. Review the other factor properties, such as Nominal Values and Settings. If necessary, change
them to the desired values.
5. Select the Tie tab.
6. Select Scale or Offset for Tie Type.
7. Enter a scale or offset for each tied factor, then select Apply.
Adams/Insight updates the computed values for the tied factors.
When you create the tie, Adams/Insight sets default values for the tie value, tie settings, and scale values
for the tied factors. If the tied factors have the same value, Adams/Insight automatically moves the value
and settings to the tie and sets all scales to 1. If the tied factors have the same absolute values,
Adams/Insight automatically makes the tie the positive value, and sets all scales to +1/-1. Otherwise,
Adams/Insight sets the tie value to 1 and the tied factor scales to the tied factor nominal values.
To untie factors:
1. Select the tie in the treeview.
2. Select the Untie factors tool.
3. Adams/Insight displays an alert asking whether you want to restore the original factor values.
4. Select Yes or No.
Adams/Insight deletes the tie and moves the tied factors back to the top level of the treeview. If
you asked to restore the original factor values, Adams/Insight resets the factor nominal values to
their original values (before they were tied). Otherwise, the factors retain the nominal values
computed from the tie.
See examples.
Examples
This section contains the following examples for tying factors:
• Example 1: Two Symmetric Factors
• Example 2: Three Scaled Factors
• Example 3: Symmetric Points
In this case, Adams/Insight creates Tie_01 as a scalar tie with the attributes of the first factor, factor_01.
factor_01 and factor_02 are now children of Tie_01. Their scales are +1 and -1, respectively. Their value
is dictated by:
f_01.value = Tie_01.currentValue * f_01.scale
f_02.value = Tie_01.currentValue * f_02.scale
The scales are the only attributes of the tied factors that Adams/Insight now uses in the experiment. The
nominal values, settings, and tolerances of the tied factors are now ignored.
When Tie_01 is set to the low value, then:
f_01.value = (Tie_01.nominalValue + Tie_01.settings.low) * f_01.scale
f_01.value = (200 + (-10) ) * 1
f_01.value = 190
Note: The above example assumes a Relative setting of the Tie; however, the same expression is
valid for Absolute or Relative_Percent.
Tie_03.right_pt.z.scale = 1
Responses
A response can be considered the output, design objective, or measurement of interest. In a Design of
Experiment (DOE), you monitor or measure the response after each Trial evaluation. After adequate trials
have been completed, you attempt to numerically establish a functional relationship between the inputs
(factors) of the system and the outputs (responses) of the system.
If successful, the response evaluates to some function whose independent variables are the factors or
inputs to the system. A scalar response is a type of response that returns a single value of interest.
Response 01 = R_01 (f1, f2, f3, ... fn)
This function could be a linear function or a higher-order function. The following example demonstrates
a quadratic response with three factors. The Adams/Insight fit utility computes the constant and
coefficients.
Response Types
There are two types of responses in Adams/Insight:
• Scalar
• Composite
Scalar Response
A scalar response is a type of response which returns a single value of interest.
Response 01 = R_01 (f1, f2, f3, ... fn)
This function could be a linear or higher order function. The following example demonstrates a quadratic
response with three factors. The Adams/Insight Fit utility computes the constant and coefficients as
follows:
Composite Response
A composite response consists of N number of scalar responses. When evaluated together, this group of
scalar responses can produce a continuous representation of a measurement. A composite response
enables you to reserve more than one column per response in the work space matrix. Traditionally you
would expect one column per response in the work space matrix when responses represent a scalar value
for each Trial. By altering the Columns field in the response attribute form you can reserve any number
of additional columns. These Columns are then named <response abbr> (0), <response abbr> (1),
<response abbr> (2), ... <response abbr> (n). Composite response member elements could be used to
store polynomial representation of a curve by putting the constant and subsequent coefficients in the
respective columns.
For example, the following composite response represents a cubic polynomial. A cubic polynomial
consists of a constant and three coefficients; therefore, four scalar responses are needed. In this example,
the four scalar responses are a function of three factors:
Now, if you vary x over a range, you can visualize the resulting curve.
This next example is a Composite Response representation of a quadratic polynomial. A quadratic
polynomial consists of a constant and two coefficients; therefore, three scalar responses are needed. In
this particular case, the composite response elements are dependent on six factors.
• curve(m, n, o) = m + (n*x) + (o*x^2)
• Response 11 (0) = R_11 (0) (f1, f2, f3, f4, f5, f6) = m
• Response 11 (1) = R_11 (1) (f1, f2, f3, f4, f5, f6) = n
• Response 11 (2) = R_11 (2) (f1, f2, f3, f4, f5, f6) = o
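Reconstructing the curve from the coefficient columns can be sketched in Python (the coefficient values are made up for illustration):

```python
# Sketch of using composite response columns as polynomial coefficients:
# the quadratic example above stores m, n, o in three columns
# (Response 11 (0..2)), and sweeping x reconstructs the curve.
def curve(m, n, o, x):
    return m + n * x + o * x ** 2

coeffs = (1.0, 2.0, 0.5)        # illustrative m, n, o values for one trial
xs = [0.0, 1.0, 2.0]
ys = [curve(*coeffs, x) for x in xs]
# ys -> [1.0, 3.5, 7.0]
```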
Experiments in Adams/Insight
In Adams/Insight, a designed experiment helps you gain understanding and improve a system or
subsystem model. The components of the experiment include:
• System model - such as a multibody dynamics simulation
• Factors - inputs to the system
• Responses - outputs or performance metrics of the system under investigation
In the experimental design process, you systematically modify factors in your model and monitor
responses after each Trial.
The design matrix does not directly specify factor values. Instead, it specifies indexes to the Levels for
each factor. The indexes center on zero. This means that for a two-level factor, the only possible values
are -1 and +1; for three-levels, -1, 0 and +1; for four-levels, -2, -1, +1, +2; and so on.
This convention implies that the levels (allowed values or range of values) are ordered from smallest to
largest, and cover a range above and below a baseline value. For example, if a factor has three levels, you
can think of the -1 index as the low value, the 0 index as the middle or baseline value, and the +1 index
as the high value. In Adams/Insight, we recommend that you list factor values from smallest to largest.
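The zero-centered indexing convention can be sketched in Python (an illustrative helper, not Adams/Insight API):

```python
# Sketch of the zero-centered level-index convention from the text:
# two levels give -1/+1, three give -1/0/+1, four give -2/-1/+1/+2
# (the 0 index is skipped when the level count is even).
def level_indices(n_levels):
    half = n_levels // 2
    if n_levels % 2:                       # odd count: include the 0 index
        return list(range(-half, half + 1))
    return [i for i in range(-half, half + 1) if i != 0]

assert level_indices(2) == [-1, 1]
assert level_indices(3) == [-1, 0, 1]
assert level_indices(4) == [-2, -1, 1, 2]
```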
Exporting Data
In Adams/Insight, you can export various components to other experiments. You can export the
following:
• Design Space
• Work Space
• Full Work Space
• Model
To export a component:
1. From the File menu, point to Export, and then select the menu item for the component you want
to export.
A file selection dialog box displays.
2. Select the file to which you want to export the components.
3. Select OK.
Importing Data
In Adams/Insight, you can import various components of other experiments. You can import the
following:
• Full Work Space
• Inclusion Design
• Results
• Work Space
To import a component:
1. From the File menu, point to Import, and then select the menu item for the component you want
to import.
A file selection dialog box displays.
2. Select the file that contains the components you want to import.
3. Select OK.
Reusing Components
In Adams/Insight, you can reuse many components of existing experiments. When you reuse
components, Adams/Insight updates your current experiment to use as many of the attributes and settings
of the old experiment as possible. As a result, you can efficiently rerun an old experiment if the new
experiment has few changes. Learn how to reuse components.
You can reuse the following:
• Factors
• Responses
• Design specifications
You can also reuse all factors, responses, and design specifications from an experiment.
When you run Adams/Insight with the reuse option, either from the command line with -reuse or from
Adams/View using the Reuse Experiment option, Adams/Insight loads the experiment and reuses the
specified old experiment. If Adams/Insight can recreate a valid workspace, it does so and then
immediately exits without displaying the Adams/Insight window. This allows you to use Adams/Insight
in a batch mode, either from an Adams/View .cmd or from the command line. As long as the experiment
has the same number of factors and the factors have the same general characteristics, you can use
Adams/Insight to regenerate the workspace and immediately exit. Some common uses of this feature are
changing factor nominal values or limits, or adding or removing responses.
Note: Reuse only updates the information in the target experiment with matching information
from the referenced experiment. For example, if the referenced experiment has additional
or unique responses that do not exist in the target experiment, they are not brought in.
Reusing Factors
When you reuse factors, Adams/Insight:
• Promotes any factors that match old inclusion factors by name.
• Creates ties in the inclusion factors (if there are factors matching old tied factors by name and
type).
• Updates most attributes for inclusion factors that match old inclusion factors by name and type.
Reusing Responses
When you reuse responses, Adams/Insight:
• Promotes any responses that match old inclusion responses by name.
• Updates most attributes for inclusion responses that match old inclusion responses by name and
type.
To reuse a component:
1. From the File menu, point to Reuse, and then select the menu item for the component you want
to reuse.
A file selection dialog box displays.
2. Select the experiment file that contains the components you want to reuse.
3. Select OK.
Simulation Properties
Displays information on the simulation used for your experiment.
Working with Results
Response Surfaces and Fitting Results
A response surface is a mathematical surface represented by a series of polynomials. It gives an
approximate value of the response (dependent variable or objective) as a function of the factors
(independent variables or Design Variables). The techniques you use to create and analyze response
surfaces are collectively called Response Surface Methodology (RSM). RSM is widely used for
developing and optimizing processes and products of all kinds (see References).
Adams/Insight computes the least-squares fit of the polynomial when you use the Fit results tool. In
statistical terms, Adams/Insight performs a multiple linear regression of polynomial models. It computes
standard analysis of variance (ANOVA) statistics for the fit, and provides a large set of ANOVA statistics,
like R2 and R2adj, to help you assess quality of fit.
Adams/Insight can export the response surface polynomial as an HTML Web page or SYLK format
spreadsheet file.
You can use response surfaces as a simplified model of a system. For example, you can use the HTML
page or SYLK file to quickly predict the effects of changing factors in your design matrix. Load the
HTML page in a browser or the SYLK file in a spreadsheet program, and modify factor values to see
the change in estimated response.
You can also use the response surface to estimate an optimal design. Because it is much quicker to
evaluate a polynomial than run a full series of simulations, optimizing estimated response is a quick way
to get an approximate optimum. You can use data in the SYLK file to do this in a spreadsheet application,
such as Microsoft Excel™.
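The idea of using the response surface as a cheap surrogate can be sketched in a few lines. This is not Adams/Insight code; the quadratic coefficients below are hypothetical stand-ins for values exported in the SYLK or HTML file, and because evaluating the polynomial is so cheap, even a brute-force grid search over the factor range is affordable:

```python
# Sketch: a fitted response surface used as a surrogate for optimization.
# The coefficients are made-up placeholders for exported fit results.
def surrogate(f):
    return 2.0 + 3.0 * f + 4.0 * f ** 2   # r ~ b0 + b1*f + b2*f^2

# A dense grid search over the factor range [-1, 1]; each evaluation is a
# polynomial, not a simulation, so thousands of points cost almost nothing.
best_f = min((-1.0 + i * 0.001 for i in range(2001)), key=surrogate)
best_r = surrogate(best_f)
print(best_f, best_r)
```

The approximate optimum found this way is a good starting point and range estimate for a subsequent direct optimization using simulations.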
R2 (R-Squared) indicates how well the response surface represents the results. It is the square of the
multiple correlation coefficient (R). R2 is the fraction of variability in the data for which the model
accounts. The larger it is, the better the fitted equation explains variations in the data. R2 is between 0
and 1. If R2 = 1, the equation exactly matches the data. A high R2 (.9 for example) indicates a good, but
not exact, fit. A low R2 (.3 for example) indicates a poor fit.
High R2 values can be deceiving. Adding more terms to the equation almost always increases R2. If you
add enough terms, you can always achieve an exact fit. However, you usually want the most efficient fit:
the fit that gives the best results with the fewest terms.
Because of this, it's useful to look at R2adj (Adjusted R-Squared), which is similar to R2 but is adjusted
to account for the number of terms. Adding terms does not always increase R2adj. If you add unnecessary
terms, R2adj often decreases. If R2 is much higher than R2adj, it indicates that at least one of the terms is
not as useful as the others and could probably be removed without hurting the fit. You find which term
to remove by further examining the results, or by trial and error.
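The R2 and R2adj computation described above can be sketched as follows. This is an illustrative least-squares fit on made-up factor/response data, not Adams/Insight's internal code; note how R2adj applies a penalty for the number of terms:

```python
import numpy as np

# Sketch with illustrative data: fit a quadratic response surface
# r ~ b0 + b1*f + b2*f^2 by least squares, then compute R2 and R2adj.
f = np.array([-1.0, -0.5, 0.0, 0.5, 1.0, 1.5])
r = np.array([ 0.9,  1.8, 2.1, 3.9, 6.0, 9.2])   # noisy quadratic-ish data

X = np.column_stack([np.ones_like(f), f, f ** 2])  # design matrix
beta, *_ = np.linalg.lstsq(X, r, rcond=None)       # least-squares coefficients
r_hat = X @ beta                                   # estimated values

ss_res = np.sum((r - r_hat) ** 2)                  # residual sum of squares
ss_tot = np.sum((r - r.mean()) ** 2)               # total sum of squares
n, p = X.shape                                     # runs, terms (incl. constant)
r2 = 1.0 - ss_res / ss_tot
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p)      # penalizes extra terms

print(round(r2, 3), round(r2_adj, 3))
```

Because the penalty grows with the number of terms p, adding an unhelpful term can lower R2adj even while R2 creeps upward.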
Even after checking R2adj, a high R2 does not always mean you have a suitable response surface. At a
minimum, you should review Residuals. Residuals are differences between original response values and
estimated values. In other words, a residual is the amount by which a fitted surface misses an original
value. Adams/Insight provides residuals for each Trial.
If a trial has an unexpectedly large residual, it could indicate that the trial is an outlier, meaning that it
might not be consistent with the other runs. Perhaps something unexpected happened or there was a
simulation error during the run. Review the results of that run, looking for unusual behavior or results. If
necessary, correct the model so that all runs complete successfully and consistently.
Large residuals can also mean that data are irregular and difficult to fit. Review your objective function
and values, looking for sudden changes in value or the slope of values. Gaps or cusps in objective values
cause poor fits. If necessary, adjust your objective function to produce smoother values.
If the runs seem consistent and objective values vary smoothly, then large residuals probably mean the
polynomial is just not a good fit and you should add more terms or fit across a smaller range of values.
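A simple residual check like the one described above can be sketched as follows, on made-up per-trial values. The two-standard-deviation cutoff is one common heuristic, not Adams/Insight's specific rule:

```python
# Sketch with made-up trial data: compute residuals (actual minus
# estimated response) and flag trials with unusually large residuals
# as candidate outliers.
actual    = [2.1, 3.4, 5.0, 4.2, 9.9, 6.1]
estimated = [2.0, 3.5, 4.8, 4.4, 6.0, 6.2]   # trial 5 misses badly

residuals = [a - e for a, e in zip(actual, estimated)]
mean = sum(residuals) / len(residuals)
spread = (sum((x - mean) ** 2 for x in residuals) / len(residuals)) ** 0.5

# Flag trials more than 2 standard deviations from the mean residual.
outliers = [i + 1 for i, x in enumerate(residuals) if abs(x - mean) > 2 * spread]
print(outliers)
```

A flagged trial warrants reviewing that run's results before deciding whether to remove it from the fit.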
When evaluating the fit, if the R2 and R2adj are red, review the workspace matrix as follows:
• Verify that all of the runs completed successfully.
• Review the residuals and determine if there is a pattern in actual versus estimate.
• Refine the model to improve the fit.
• See if the factors you selected have any impact on the response (review Term Significance in the
Terms form).
• Check the Error DOF in the fit summary.
Refinement of a Fit
Following are the typical steps of refining a model:
1. Fit regression model
2. Check R2 and interpret ANOVA table
3. Verify residuals plots
4. Remove outliers, if needed
5. Remove terms, if needed
6. Check R2 and interpret ANOVA table
7. Transform response, if needed
Exporting Results
You can export your results to a file using these file formats:
• HTML - HTML-format Web page
• SYLK - Symbolic Link (SYLK) format spreadsheet file
• Visual Basic - Visual Basic subroutines
• MATLAB - MATLAB M-File
The SYLK and HTML formats show you a table of the responses and factors where you can change
variable values, and automatically compute new estimates. To do this, display the HTML page in a Web
browser enabled to read JavaScript, or load the SYLK file into a spreadsheet program, such as Microsoft
Excel. The SYLK format is a convenient way to transfer response surface equations to a spreadsheet
program for further study.
The Visual Basic format file contains Visual Basic subroutines to compute the responses.
The MATLAB format file contains MATLAB matrices that can be used to compute the responses.
HTML: Factor
The factor section lists each factor in the design matrix. Each row shows information about one factor
including the factor name, units, current value, tolerance (optional), minimum, nominal, and maximum
value.
You can modify the current value by typing a number in the text box or by selecting the
increment/decrement buttons. After entering a value in a factor current value text box, you must select
the Update button or press Enter to see the response effect.
HTML: Plot
The plot section lists the composite responses only. You can make changes to the values in this section
and press Update Plots to redraw the plots in the separate window. You can also check the Swap XY
check box to invert the x and y axes.
HTML: Response
The response section lists each response in the design matrix. Each row shows information about one
response including the response name, units, and current value.
Main Effects
Main effect refers to the primary effect of a factor. A good way to examine the main effects is through a
Pareto chart.
The Adams/Insight .htm file computes main effects on the fly using JavaScript.
The displayed main effect of a factor is the difference between the response at the factor maximum value
and the response at the factor minimum value, while all other factors are at their average values. Effects
may be positive (response increases with larger factor value) or negative (response decreases with larger
factor value).
Note that the minimum and maximum factors' values do not necessarily produce the minimum and
maximum response values. If a response is highly nonlinear over the factor value range, the minimum
and/or maximum response values may be in the middle of the curve. In this case, the main effects values
are meaningless.
The effect % is the ratio of the effect value to the response value with all factors at their average values.
An effect % greater than 100% means that the variation in the response value is larger than the average
response value.
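The main-effect and effect % definitions above can be made concrete with a small sketch. The response surface and factor ranges here are hypothetical; each effect sweeps one factor from its minimum to its maximum while holding the other at its average:

```python
# Sketch: main effects from a hypothetical fitted response surface r(f1, f2).
def r(f1, f2):
    return 10.0 + 4.0 * f1 - 2.0 * f2     # made-up response surface

ranges = {"f1": (0.0, 2.0), "f2": (1.0, 3.0)}           # factor min/max
avg = {name: (lo + hi) / 2 for name, (lo, hi) in ranges.items()}

def main_effect(name):
    lo, hi = ranges[name]
    others = dict(avg)
    others[name] = hi
    at_max = r(**others)                   # response at factor maximum
    others[name] = lo
    at_min = r(**others)                   # response at factor minimum
    return at_max - at_min

baseline = r(**avg)                        # all factors at their averages
effects = {name: main_effect(name) for name in ranges}
percent = {name: 100.0 * e / baseline for name, e in effects.items()}
print(effects, percent)
```

Here f1 has a positive effect and f2 a negative one, and the effect % expresses each relative to the response with all factors at their averages.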
The effects are sorted largest to least absolute value. The longest bar is always the same length. The other
bars are proportional to the largest based on the effect value relative to the largest value. Positive effects
have a dark blue bar, negative effects have a light blue bar.
SYLK: Factor
The factor section lists each factor in the design matrix. Each row shows information about one factor
including the factor name, units, current value, tolerance, minimum, and maximum value.
You can modify the current value by typing a number in the text box. After entering a value in a factor
text box and pressing Return or Tab, you can see the response effect.
SYLK: Response
The response section lists each response in the design matrix. Each row shows information about one
response including the response name, units, current value estimate, and tolerance.
SYLK: Tolerance
The Tolerance Contributions table provides the percent contribution of each factor to the tolerance of
each response. A high value means the factor tolerance greatly contributes to the response tolerance. The
response tolerance and tolerance contributions vary with both the factor values and the factor tolerance values. For
more information, click on Tolerance Contributions in the left pane of this window.
Tolerance
The tolerance value can be initially specified as one of the factor attributes. If any of the factors have a
nonzero tolerance attribute, the published Web page will present this value and the responses will have a
tolerance computed for each time a factor value is modified.
The computed tolerance reflects the same amount of variation as the factor tolerance values. For
example, if you enter factor tolerances that are three times the standard deviation, then the computed
response tolerance will be three times the standard deviation of the response.
Note that the tolerance calculation always assumes a normal distribution for factor variations. This is true
even if you have selected None or Uniform for Monte Carlo Distribution in the Factor form.
Adams/Insight only uses the Monte Carlo Distribution setting for Monte Carlo experiments, not for the
tolerance calculations in the exported HTML file.
The method Adams/Insight uses to compute the response tolerance is described in several papers. A
specific reference is "A New Tolerance Analysis Method for Designers and Manufacturers" by
Greenwood and Chase. See References for more details.
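A linearized root-sum-square (RSS) estimate is consistent with the normal-distribution assumption described above; whether it matches the referenced method in every detail is not confirmed here, and the sensitivities and factor tolerances below are made-up numbers:

```python
# Sketch of a linearized RSS tolerance estimate. Sensitivities would come
# from the response surface at the current factor values; all numbers here
# are illustrative only.
sensitivities = [2.0, -1.0, 0.5]    # d(response)/d(factor) at current point
factor_tols   = [0.3,  0.6, 0.2]    # factor tolerances (same sigma multiple)

contributions = [s * t for s, t in zip(sensitivities, factor_tols)]
response_tol = sum(c ** 2 for c in contributions) ** 0.5
print(round(response_tol, 4))
```

Because the factor tolerances enter as a common sigma multiple, the computed response tolerance carries the same multiple, as noted above.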
The assumptions of this computation are:
Tolerance Contributions
The Tolerance Contribution table shows the relative contribution by each factor to the variation
(tolerance) in each response. The contributions are rounded to the nearest percent. The values in each row
add to 100%, plus or minus a few percent due to the rounding.
A high value indicates that the factor variation greatly contributes to the response variation. A low value
indicates that the factor does not contribute much to the response variation. A value of zero indicates
either that the factor does not affect the response at all, or that the variation in the factor has only an
insignificant effect compared to the other factors.
The contribution values only show relative importance, they do not directly indicate how much the
response variation will drop if the factor variation is eliminated. For example, if there are two factors and
each contributes 50% to the response variation, eliminating the variation of one factor will not cut the
response variation in half. Instead, it will reduce it by about 30%.
This is because the total response variation is the square root of the sum of the individual factor
contributions squared. The percentage contribution is calculated as the ratio of the factor contribution
squared to the sum of all the contributions squared.
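The arithmetic above can be verified with a short sketch: percentage contributions are squared contributions over the sum of squares, and removing one of two equal contributors reduces the RSS total by about 29%, not 50%:

```python
# Sketch verifying the two-equal-factors example above.
contribs = [0.5, 0.5]                        # two equal factor contributions
total = sum(c ** 2 for c in contribs) ** 0.5 # RSS total response variation
percent = [100 * c ** 2 / sum(x ** 2 for x in contribs) for c in contribs]

remaining = contribs[1]                      # variation of factor 1 eliminated
reduction = 1 - remaining / total            # fractional drop in variation
print(percent, round(100 * reduction, 1))
```

Each factor shows a 50% contribution, yet eliminating one factor's variation cuts the total by only 1 - 1/sqrt(2), roughly 29.3%.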
Using Adams/Insight Tools
Workspace
Scatter Plots
Scatter plots allow you to view the raw data plotted against another variable. You can also plot the raw
values against the trial count.
Note: • The following are predefined variables in the Work Space Column Calculator:
• rtod for radians to degrees conversion
• dtor for degrees to radians conversion
• curRow for the integer value of the current row (this is a zero-based index)
• Another class of predefined variables is the factor values. To access these values,
use the factor abbreviation.
• Correlation values below 0.5 generally indicate a chaotic relationship between the variables.
• Nonlinearities of the system may confuse the correlation index.
Work Space
The Work Space is a matrix with the rows indicating the runs and the columns identifying the factor
settings and resulting response values in engineering units. It is sometimes referred to as the run matrix.
Learn about the Design Work Space form.
Examples
Following are examples of using the Work Space Column Calculator in Adams/Insight.
Example 1
If the values of response r_01 were measured in miles per hour (mph), you could convert the raw data to
kilometers per hour (kph) as follows:
1. In the Work Space Column Calculator, set Column to Compute to r_01.
2. Set the Expression area to r_01*1.609344.
3. Select Apply or OK.
4. The updated values display in the r_01 column.
Example 2
Synthesize a relationship between three factors and a response
1. Create three factors and one response.
2. Create a workspace matrix.
3. Open the workspace column calculator and set Column to Compute to r_01.
4. Enter the following expression: '2 + 3*f_01 + 4*f_02**2 + 5*f_03**3'
5. Select OK.
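Conceptually, the Column Calculator evaluates the expression once per row of the work space matrix. The following sketch mimics what Example 2 computes, using a made-up three-trial run matrix; as noted above, factor values are referenced by their abbreviations:

```python
# Sketch: row-by-row evaluation of the Example 2 expression
# '2 + 3*f_01 + 4*f_02**2 + 5*f_03**3' over a made-up run matrix.
runs = [
    {"f_01": 1.0, "f_02": 0.0, "f_03": 1.0},
    {"f_01": 0.0, "f_02": 2.0, "f_03": 0.0},
    {"f_01": 1.0, "f_02": 1.0, "f_03": 2.0},
]

r_01 = [2 + 3 * row["f_01"] + 4 * row["f_02"] ** 2 + 5 * row["f_03"] ** 3
        for row in runs]
print(r_01)
```

Each entry of r_01 is the synthesized response for the corresponding trial.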
Example 3
Use of Python functions in the expression
1. Create three factors and one response.
2. Create a workspace matrix.
3. Open the workspace column calculator and set Column to Compute to r_01.
4. Enter the following expression: '2 + 3*f_01 + 4*f_02**2 + 5*f_03*f_02 + random()'
5. Select OK.
Optimization
Using Adams/Insight you can optimize your factor values based on a response surface (fitted regression
model). If your experiment uses a simulation conduit that supports direct execution, you can also directly
optimize your experiment using simulations. During optimization, Adams/Insight automatically adjusts
the factor values so that the resulting responses come as close as possible to the specified target values.
To optimize a response surface, select a model under Analysis in the tree view, then select Tools-
>Optimize Model or the Optimize tool in the toolbar. Adams/Insight displays the Optimize Model or
Experiment dialog box. Adams/Insight uses the fitted model to estimate the response values.
Using Adams/Insight, you can perform Single-Objective Optimization and Multi-Objective Optimization.
Single-objective optimization involves trying to achieve a target for one scalar response; multi-objective
optimization involves more than one scalar response. If you choose more than one response as objectives,
Adams/Insight will calculate a multi-objective cost based on the objective options, targets, weights, and
multi-objective method option. Adams/Insight will then minimize the overall multi-objective cost.
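One common multi-objective cost of the kind described above is a weighted sum of squared deviations from the targets. This is a sketch only; Adams/Insight's actual cost depends on the objective options and the multi-objective method you choose, and the response names and numbers here are illustrative:

```python
# Sketch of a weighted-sum multi-objective cost (illustrative names/values).
objectives = {
    "toe_angle": {"value": 0.12, "target": 0.0, "weight": 2.0},
    "camber":    {"value": 1.05, "target": 1.0, "weight": 1.0},
}

cost = sum(o["weight"] * (o["value"] - o["target"]) ** 2
           for o in objectives.values())
print(round(cost, 6))
```

Minimizing this single scalar drives every weighted response toward its target, with the weights controlling the trade-off between objectives.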
If you want to do a direct optimization using simulations, it is a good idea to first create a response
surface, optimize that, then use the Save button in the Optimization Form to save your settings and the
optimal point as the new defaults. Then, when you do the direct optimization, the response minimum and
maximum values will be good estimates, and the starting point for the direct optimization will be a good
starting point to find the true optimum. If you do not do an initial response surface and optimization, be
sure to specify appropriate response minimum and maximum values in the Response form, in the
Optimization tab. For more information on optimizing in Adams/Insight, refer to Optimizing Results in
Using Adams/Insight with Adams/View.
Preferences
You can define the design, fit, and thresholds preferences for your experiment.
Note: Optimization preferences are set in the Optimization Preferences dialog box.
To set preferences:
1. From the Edit menu, select Preferences.
2. Select the Design, Fit, and/or Thresholds tab, and enter your preferences as described in the
Preferences Dialog Box.
3. Select OK.
Refinement
After fitting a model in Adams/Insight, it is important to evaluate the quality (or "Goodness") of the fit.
If the fit does not meet your criteria, you can refine it using Adams/Insight.
You can perform a manual or automatic refinement.
Manual refinement
To manually refine the fit of your model:
1. From the Tools menu, point to Refine Model Manually, and then select one of the following:
• Remove Outliers: Select Outliers to remove from your experiment.
• Remove Terms: Select terms to remove from your experiment.
• Transform Response: Specify the type of transformation for your response.
• Change Order: Specify the model order for your experiment.
2. Enter the information as described in the dialog boxes:
• Refinement - Remove Outliers
• Refinement - Remove Terms
• Refinement - Transform Response
• Refinement - Change Order
3. Select OK.
4. Repeat steps 2 and 3 until no other terms can be added or removed, or until a term combination is
repeated.
Adams/Insight repeats the stepwise refinement on each regression in the specified model.
Depending on the hierarchy option you choose, Adams/Insight may add or remove individual terms, or
only groups of terms. Hierarchy refers to the relationship between polynomial terms, and whether or not
some terms should be added or removed before others.
At each step, the stepwise refinement evaluates the terms or sets of terms that could be added or removed
based on hierarchy. It evaluates them using your choice of F or P value, and your specified threshold
value to add or remove terms. It first evaluates all possible terms or sets of terms to add. If the best term
or set is more significant than your specified threshold, it will add the term or set. If not, no term is added.
Then, it evaluates all possible terms or sets of terms to remove. If the worst term or set is less significant
than your specified threshold, it will remove it. If not, no term is removed.
The refinement ends when Adams/Insight cannot add or remove any more terms, or when a term
combination is repeated.
Many of the term evaluations require a new computed fit. As the number of terms increases, not only
does the number of evaluations increase, but the time to compute the fit increases as well. For large
models, stepwise refinement may be quite slow, especially with no hierarchy.
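The add/remove loop described above can be sketched as a simplified stepwise selection. Adams/Insight uses F or P thresholds and hierarchy rules; this version greedily adds the candidate term that most improves adjusted R2 and drops any term whose removal does not hurt it, stopping when the active set stops changing, on made-up noise-free data:

```python
import numpy as np

# Simplified stepwise refinement sketch (not Adams/Insight's algorithm):
# greedy add/remove of polynomial terms scored by adjusted R^2.
f = np.linspace(-1, 1, 20)
r = 1 + 2 * f + 3 * f ** 2                 # true model uses 1, f, f^2 only

terms = {"1": np.ones_like(f), "f": f, "f^2": f ** 2, "f^3": f ** 3}

def r2_adj(names):
    X = np.column_stack([terms[n] for n in names])
    beta, *_ = np.linalg.lstsq(X, r, rcond=None)
    resid = r - X @ beta
    r2 = 1 - (resid @ resid) / ((r - r.mean()) @ (r - r.mean()))
    n, p = X.shape
    return 1 - (1 - r2) * (n - 1) / (n - p)

TOL = 1e-6
active = ["1"]
for _ in range(10):                         # guard against cycling
    changed = False
    inactive = [n for n in terms if n not in active]
    if inactive:                            # try adding the best inactive term
        best = max(inactive, key=lambda n: r2_adj(active + [n]))
        if r2_adj(active + [best]) > r2_adj(active) + TOL:
            active.append(best)
            changed = True
    for n in [t for t in active if t != "1"]:   # try dropping useless terms
        if r2_adj([m for m in active if m != n]) > r2_adj(active) - TOL:
            active.remove(n)
            changed = True
    if not changed:                         # no add or remove: refinement done
        break
print(sorted(active))
```

On this data the loop keeps exactly the constant, linear, and quadratic terms, and correctly declines to add the unnecessary cubic term.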
The following tutorials teach you how to use Adams/Insight with other products:
Overview
This chapter introduces you to the tutorial and gets you started. The tutorial demonstrates how the
Adams/Insight and ASCII conduit can be used in conjunction with Adams/Solver simulation files.
The sections in this tutorial are:
• About this Tutorial
• About Adams/Insight with ASCII Conduit
• Getting Started
• Parameterizing the System
Getting Started
Here you will create your working directory and copy over the necessary files.
Note: On Windows, you may need to set the permissions to Full Control to edit the tutorial files.
To get started:
Note: You can skip steps 1-3 below if you previously used the Help > Copy Examples To feature
to copy all of the tutorials for Adams/Insight.
1. Create a working directory called ain_examples/asc. This directory will contain all of the files
for this tutorial.
2. Copy the following files from <install_dir>/ainsight/examples to the newly created working
directory:
• ain_tut_101_asc_adm.acf
• ain_tut_101_asc_adm.adm
3. Copy the two Adams simulation files by performing one of the following from a command
prompt in the ain_examples/asc directory:
• On Windows:
Using Adams/Insight with the ASCII Conduit (ASC)
For example, if you have a string such as, 'PART/02 , MASS = 160.0, CM = 0203' and
you want to alter the mass of the part, you would first highlight the numeric value of '160.0',
and then right-click and assign the highlighted text to a variable (by either creating a new
variable, or referencing an existing variable displayed in the shortcut menu). When creating a
new variable, you can define the following:
• Name: Descriptive name of the variable.
• Format: Controls how the value will be printed. The convention follows the C printf()
convention of %d for integer, %f for float, %e for scientific notation, and
%s for string. On UNIX, use the man printf command to get more information on this
numeric formatting convention. In most cases, you don’t need to modify the default value.
• Value: Default value that was originally highlighted.
• Description: Optional supplemental information regarding the particular variable.
Note: The double curly brace delimiters '{{' '}}' that appear in the text file are the default delimiters. You
can change them on a template-by-template basis in the template properties.
Note: Be sure that there are two underscores between the closing curly brace and bungy.
The parameterization of the bungy.adm ASC template is complete. Now, you parameterize the
analysis names specified in the bungy.acf ASC template (first you open the appropriate ASC
template and then parameterize it).
20. From the Template menu, select bungy.acf.
21. Parameterize the first two lines of the bungy.acf ASC template so they look like the following:
{{ascTrialName}}__bungy.adm
{{ascTrialName}}__bungy
Note: Be sure that there are two underscores between the closing curly brace and bungy.
22. Specify ASC template properties specific to the bungy.acf ASC template: from the Edit menu,
select Template Properties.
The Template Properties dialog box appears.
23. Now you will specify how the simulations will be run. Do this by completing the following text
boxes:
• On Windows:
• Execution Prefix: mdadams2010 ru-s
• On UNIX:
• Execution Prefix: mdadams2010 -c ru-stan i
• Execution Postfix: exit
24. Enter a Python dictionary definition in the Post Operations (Dict) text box. For this example,
copy and paste the following:
{'total_length':'adams_r.Fetch_req(1,3,"max")','max_acc':'adams_r.Fetch_req(1,4,"max")'}
This string specifies what happens after the simulations are complete. Specifically, what parts of
the simulation results files will be interrogated or how the postprocessing will occur. The results
are retrieved from the solver files and specific values are placed in the Work Space.
Note: You can also see an example of this string on a commented-out line in the bungy.adm file.
Note: Look in the window that you used to start the Adams/Insight ASC editor for warning
messages. Make corrections as necessary.
Overview
In this chapter, you’ll create an experiment and run through a number of trials that you set up in the
experiment.
The sections in this tutorial are:
• Creating the Experiment
Note: For more information on the options available with the _mer.py file, execute one of the
following commands:
2. To test the configuration and only run the first trial, enter one of the following commands:
• On Windows: mdadams2010 python j_asc_exp_mer.py
• On UNIX: mdadams2010 -c python j_asc_exp_mer.py exit
The results of this operation are placed in a subdirectory with the default prefix of tst.
3. To run all trials, enter one of the following:
• On Windows: mdadams2010 python j_asc_exp_mer.py -t
• On UNIX: mdadams2010 -c python j_asc_exp_mer.py -t exit
To check that the results files were created, change to the tst_dir subdirectory and view its
contents.
Note: You can also use the toolbar icons to run the trials.
Overview
This section introduces you to the tutorial and gets you started. The tutorial guides you through the
process of using Adams/Insight with Adams/Car to investigate transient dynamic response of a vehicle
front-suspension model.
The sections in this tutorial are:
• About the Tutorial
• Starting Adams/Car
• Creating the Model
• Adams/Insight Interface
• Determining if exaggerated changes in toe angle result in aggressive tire wear. Toe angle is the
projected angle the wheel plane makes with the ground when viewed from above the vehicle;
toe-in is considered positive, and toe-out is considered negative.
• Assessing your model against a manufacturing variation.
• Assessing packaging requirements for your model.
Starting Adams/Car
The section provides instructions on how to start Adams/Car on UNIX and Windows.
Note: On Windows, you may need to set the permissions to Full Control to edit the tutorial files.
Note: If you previously used the Help > Copy Examples To feature to copy all of the tutorials for
Adams/Insight, you can skip this step; your working directory is ain_examples/acar.
5. Select OK.
Running a Simulation
Before you create your experiment, you’ll simulate the suspension model in Adams/Car to run a baseline
parallel travel analysis. There are three important reasons to run a simulation before beginning your DOE
analysis. They are:
• Running a simulation sets up the assembly for the type of analysis you will perform in
Adams/Insight. This is important because the topology of the assembly can change slightly
depending on the type of analysis performed.
• Running the simulation creates a simulation script that you use in the Adams/Insight experiment.
• You need to determine whether or not you can analyze the assembly in its current configuration.
Using Adams/Insight with Adams/Car
Note: Adams/Car places the full object hierarchy as part of the name in the Result Set Comp. text
box.
5. Set the Design Objective's value to the maximum absolute value during simulation.
You are interested in the maximum value of the toe because this is the value that you want to
minimize as a result of your experiments.
6. Select OK.
Starting Adams/Insight
In this section, you’ll open Adams/Insight from Adams/Car and begin creating an experiment to measure
the performance of a suspension model.
Adams/Insight Interface
This section describes what you see when Adams/Insight first opens. Figure 3 shows the main window
as it appears when you first launch Adams/Insight. It includes the following items:
• Menu bar - Contains pull-down menus for File, Edit, Define, Simulation,
Tools, and Help.
• Toolbars - Contain commonly used tools for accessing files, and creating and modifying
designed experiments.
• Treeview - Displays a hierarchical list of objects that you can include in an experiment. The
tree is especially useful for selecting and identifying objects when you are creating a design
matrix.
• Viewport - The area of the window that displays parameters for modifying the objects you
select from the treeview.
• Status bar - Displays messages and issues prompts during your Adams/Insight session.
Adams/Insight Toolbars
The Adams/Insight main window has four toolbars:
• Main (Experiments) toolbar - Lets you execute basic commands.
• Adams/Insight (Experiments Contents) toolbar - Helps you build and execute your
experiment.
• Work Space toolbar - Lets you execute commands on the work space.
• Report toolbar - Lets you generate and export a report.
If you hold your mouse pointer over any tool, tip text appears giving a short description of the tool.
Tools in toolbars are arranged in the order that you’ll use them in the process of creating and executing
your designed experiment. Depending on where you are in the process of creating an experiment,
Adams/Insight enables or disables the tools (you can always display and undisplay them if you need to).
This feature alerts you to the correct order of procedures to follow. For example, the Run simulations tool
is disabled until you define required elements for a design matrix.
For more information on the toolbars, see the Adams/Insight online help.
Overview
In this section, you’ll create a design matrix and run a model through a number of simulations that you
set up in the experiment.
The sections in this tutorial are:
• Creating a Design Matrix
• Running Your Experiment
Promoting Candidates
The first step in creating your designed experiment is to select the factors that you want to
include in your design matrix. You select factors from the Candidates list in the treeview, and then
promote them to the Inclusions list. Promoting candidates to inclusions causes them to become part of
your design matrix.
Note: The treeview displays the full object hierarchy for each factor. This tutorial will only refer
to the variable name. For example, the variable hpl_tierod_outer.x appears as
TR_front_Suspension.ground.hpl_tierod_outer.x in the treeview.
3. Select the candidate, hpl_tierod_outer.x, and then move your cursor to the Adams/Insight
toolbar and select the Promote to inclusion tool.
The candidate hpl_tierod_outer.x moves to the Inclusion list under Factors in the treeview.
Tip: To select more than one factor, hold the Ctrl key as you click. To promote the factors
directly from the treeview, press the shortcut key F5.
• Type: Continuous
• Delta Type: Relative
• Settings: -5, 5
Promoting Responses
Now that you have finished promoting and modifying your factors, the next step is to promote your
responses for the experiment.
Hint: You can click the minus (-) sign in front of Factors to collapse that section of the treeview
and save screen space.
Modifying Responses
The modifications you’ll make to the responses are minor. You’ll add units and change one of the
parameters. To learn more about response parameters, press the F1 key from the Response form.
To modify responses:
1. In the treeview, under Responses, in the Inclusions list, select the response, toe_angle_objective.
The Response form appears, shown next, in the viewport.
Note: Output characteristics are grayed out when you use Adams/Insight with Adams/View and
other Adams applications. The output characteristic is set by the originating CAE
application, and is displayed in the Response form for information only.
4. Select Apply.
Adams/Insight saves your response modifications.
5. Select the Define menu, point to Experiment Design, and then select Create Work Space.
Note: Clicking the Generate Work Space tool in the Adams/Insight toolbar performs steps 4 and 5.
The Work Space appears in the viewport as shown in Figure 8. This table displays the work space matrix
for the fractional-factorial experiment that you defined earlier in the tutorial. Adams/Car will run a
simulation for each trial defined in this matrix. The column headings are sortable and sizeable. You can
also select Work Space Review to view summary information for each factor and response in your
experiment.
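Conceptually, the half-fraction behind a work space like this can be sketched in a few lines: enumerate the base factors at their low/high levels and derive the remaining column from a generator. The factor names and the generator C = A*B below are illustrative, not taken from the tutorial model:

```python
from itertools import product

def half_fraction(levels=(-1, 1)):
    """Build a 2^(3-1) fractional-factorial design: enumerate the two
    base factors A and B, then set C from the generator C = A*B."""
    runs = []
    for a, b in product(levels, repeat=2):
        runs.append({"A": a, "B": b, "C": a * b})
    return runs

for run in half_fraction():
    print(run)
```

Four trials instead of the eight a full factorial would need, at the cost of confounding C with the A-B interaction.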
In the treeview, at the Design level, the letters D:W appear to indicate that the Design contains a
successfully generated design work space.
Note: Columns appear in the work space matrix in the order that you promote factors for inclusion.
Tip: Put your mouse pointer over column headings to display key information about the
abbreviation shown.
Adams/Car opens and runs the simulations defined by your experiment. The Message window
appears and displays standard Adams/Solver messages, which you can ignore for this tutorial.
Note: This procedure builds, runs, and postprocesses all of the simulations within the Adams/Car
session. We recommend that you break up the process flow into its separate phases using
the MDI INSIGHT BUILD and MDI INSIGHT LOAD commands. This is especially
important when you have more than 30 trials.
Overview
This chapter guides you through reviewing the results of your analysis; fitting your data to a polynomial
to determine which factors most affect model performance; and publishing results to an HTML or SYLK
file.
The sections in this tutorial are:
• Reviewing Results
• Fitting Results
• Publishing Results
Reviewing Results
After Adams/Car completes the trials defined in your design matrix, you return to the Adams/Insight
interface to view the results.
To return to Adams/Insight:
1. From the main menu in Adams/Car, point to Simulate, point to DOE Interface, point to
Adams/Insight, and then select Display.
The Adams/Insight Display dialog box appears.
2. Verify the name of your experiment, and then select OK.
The Adams/Car window is hidden and the Adams/Insight window opens.
Simulation results from Adams/Car appear in the design matrix as shown in Figure 9.
Fitting Results
Now that Adams/Car has completed the trials defined in your work space matrix, you can use
Adams/Insight to fit your results to a polynomial or a response surface. The purpose of fitting your results
is to establish a relationship between the factors and responses that you selected for the work space
matrix. Fitting results includes a multiple regression. You will be able to investigate the parts of the
regression in the Summary, located in the treeview under Analysis, after completing the following steps.
For more information on this topic, refer to the Adams/Insight online help.
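The fit itself is a standard multiple linear regression. As a rough sketch of what happens under the hood (the two-factor trial data below is made up for illustration):

```python
import numpy as np

# Hypothetical trial data: two coded factor columns and one measured response.
X = np.array([[-1.0, -1.0],
              [ 1.0, -1.0],
              [-1.0,  1.0],
              [ 1.0,  1.0]])
y = np.array([2.0, 4.0, 3.0, 5.2])

# Add an intercept column, then solve the multiple regression by
# linear least squares: y ~ b0 + b1*x1 + b2*x2.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # [intercept, effect of factor 1, effect of factor 2]
```

The fitted coefficients are what relate each factor to each response; Adams/Insight performs the equivalent fit for every response you promoted.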
For definitions of the items in the results table, see online help.
The tables also provide you with a color code that indicates the soundness of your results:
• Green indicates that all fit criteria meet or exceed the highest fitting thresholds.
• Yellow indicates that the fit criterion may bear investigation.
• Red indicates that the fit criterion should be investigated.
Publishing Results
Adams/Insight lets you save your findings as either HTML or SYLK files. Once saved, you can use either
a browser or spreadsheet program, such as Excel, to modify factors and see the effect on responses
without performing full simulations.
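Conceptually, the exported page evaluates the fitted polynomial at whatever factor values you type in, rather than re-running a simulation. A minimal sketch with made-up coefficients (the tutorial's factor name is used only as a dictionary key):

```python
# What the exported page does conceptually: evaluate the fitted
# polynomial at new factor settings instead of re-running a simulation.
# The coefficients below are illustrative, not from a real fit.
def estimate_response(factors, coef):
    """coef = {'const': c0, factor_name: slope, ...} for a linear fit."""
    value = coef["const"]
    for name, x in factors.items():
        value += coef[name] * x
    return value

coef = {"const": 0.12, "hpl_tierod_outer.x": -0.004}
print(estimate_response({"hpl_tierod_outer.x": 417.0}, coef))
print(estimate_response({"hpl_tierod_outer.x": 420.0}, coef))
```

Because only a polynomial is evaluated, the estimate updates instantly, which is why the published page can respond to factor changes without Adams/Solver.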
2. Change the value for the first factor hpl_tierod_outer.x from 417 to 420, and then select Update.
The estimated responses adjust to reflect the new factor values. Notice that the value for the
response, toe_angle_objective, reflects a change.
3. You can continue to vary the factor values and investigate how changes to them affect your
responses. To learn more about analyzing the results of your experiment and publishing your
results to HTML or SYLK pages, refer to the Adams/Insight online help.
Note: The check boxes only appear if you specified the corresponding data in your experiment.
For example, if you didn’t specify tolerances for your factors, the Contributions check box
will not display.
Overview
This chapter introduces you to the tutorial and gets you started. The tutorial demonstrates how to use
Adams/Insight with Adams/Chassis to perform a design of experiments (DOE) analysis. In this example,
you will see the effect of front spring rate, front stabilizer bar diameter, and front lower ball joint position
on understeer gradient.
The sections in this chapter are:
• About the Tutorial
• Starting Adams/Chassis
• Setting up the Investigation
• Adams/Insight Interface
Parameters you'll modify:
• Front spring rate
• Front stabilizer bar diameter
• Front lower ball joint position
Performance attributes you'll monitor:
• Understeer gradient
Description of event: The constant radius event is quasi-static and simulates the vehicle under a series of lateral acceleration levels.
Starting Adams/Chassis
You begin this tutorial by starting Adams/Chassis.
Note: On Windows, you may need to set the permissions to Full Control to edit the tutorial files.
• In the Current Working Directory text box, enter the name of the working directory you
created in step 1. You can use the browse tool or type the absolute path to the directory.
• Select Save.
Now, you will build and run the simulation, and then verify the results.
6. Select the Improve mode .
7. In the top section of the treeview, double-click achassis_gs_full_sys_swpt.
achassis_gs_full_sys_swpt moves to the Investigation Events folder in the bottom
section of the treeview.
8. From the Setup Investigation tab in the property editor, select Create New Investigation.
9. Leave all remaining defaults.
10. Select Go.
Adams/Insight opens with your experiment loaded.
Adams/Insight Interface
This section describes what you see when Adams/Insight first opens. Figure 3 shows the main window
as it appears when you first launch Adams/Insight. It includes the following items:
• Menu bar - Contains pull-down menus for File, Edit, Define, Simulation, Tools, and Help.
• Toolbars - Contain commonly used tools for accessing files, creating and modifying designed
experiments.
• Treeview - Displays a hierarchical list of objects that you can include in an experiment. The tree
is especially useful for selecting and identifying objects when you are creating a design matrix.
• Viewport - The area of the window that displays parameters for modifying the objects you select
from the treeview.
• Status bar - Displays messages and issues prompts during your Adams/Insight session.
Adams/Insight Toolbars
The Adams/Insight main window has four toolbars:
• Main (Experiments) toolbar - Lets you execute basic commands.
• Adams/Insight (Experiments Contents) toolbar - Helps you build and execute your experiment.
• Work Space toolbar - Lets you execute commands on the work space.
• Report toolbar - Lets you generate and export a report.
If you hold your mouse pointer over any tool, tip text appears giving a short description of the tool.
Tools in toolbars are arranged in the order that you’ll use them in the process of creating and executing
your designed experiment. Depending on where you are in the process of creating an experiment,
Adams/Insight enables or disables the tools (you can always show or hide them if you need to).
This feature alerts you to the correct order of procedures to follow. For example, the Run simulations tool
is disabled until you define required elements for a design matrix.
For more information on the toolbars, see the Adams/Insight online help.
Overview
In this chapter, you’ll create a design matrix and run a model through a number of simulations that you
set up in the experiment.
The sections in this chapter are:
• Creating a Design Matrix
• Running Your Experiment
Promoting Factors
The first step in creating your designed experiment is to select the factors that you want to
include in your design matrix. You select factors from the Candidates list in the treeview, and then
promote them to the Inclusions list. Promoting candidates to inclusions causes them to become part of
your design matrix.
Tip: To select more than one factor, hold the Ctrl key as you click. To promote the factors
directly from the treeview, press the shortcut key F5.
7. Enter -20,20 in the Settings text box, and then select Relative Percent as the Delta Type. This
sets the spring rate to vary from 80% to 120% of its nominal value.
8. Select Apply.
9. Expand properties, Front, achassis_gs_front_suspension, front_suspension, stabilizer_bar,
and achassis_gs_front_suspension_beamx_sta. Select the candidate
achassis_gs_front_suspension_beamx_sta_diameter.
10. Leave the Settings at their default values to modify the stabilizer bar diameter ± 4 mm.
11. Promote the candidate.
12. To vary the front lower ball joint position, expand properties, Front,
achassis_gs_front_suspension, front_suspension, lower_ball_joint, and then left.
13. Hold down the Ctrl key and select front_suspension_lower_ball_joint_left_x and
front_suspension_lower_ball_joint_left_y. Promote these factors.
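The steps above mix delta types: the spring rate uses Relative Percent settings of -20, 20 (80% to 120% of nominal), while other factors use default settings. As a sketch of how settings map onto actual factor levels under each type (the function and its type names are illustrative, not the Adams/Insight API):

```python
def factor_levels(nominal, settings, delta_type):
    """Illustrative mapping from Adams/Insight-style settings to levels.
    'absolute'         -> use the settings as-is
    'relative'         -> nominal + setting
    'relative percent' -> nominal * (1 + setting/100)"""
    dt = delta_type.lower()
    if dt == "absolute":
        return [float(s) for s in settings]
    if dt == "relative":
        return [nominal + s for s in settings]
    if dt == "relative percent":
        return [nominal * (1 + s / 100.0) for s in settings]
    raise ValueError(delta_type)

# A 100 N/mm spring rate varied by -20..+20 percent spans 80..120 N/mm:
print(factor_levels(100.0, [-20, 20], "relative percent"))
```

The nominal spring rate of 100 N/mm here is an assumed value chosen purely to make the 80%-120% arithmetic visible.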
Promoting Responses
The next step in defining the design matrix is to select response variables.
To promote responses:
1. In the treeview, select the + in front of Responses.
The levels nested under Responses expand to reveal Inclusions and Candidates.
Hint: You can select the minus (-) sign in front of Factors to collapse that section of the
treeview and save screen space.
2. Under Candidates, you’ll see a list of responses that are potential candidates you can include in
your design matrix. Expand e_001_achassis_gs_full_sys_swpt and then select
e_001_achassis_gs_full_sys_swpt_Roll_grad (for roll gradient) to display the Response form
shown in Figure 14 below.
3. In the treeview, hold down the Ctrl key and select e_001_achassis_gs_full_sys_swpt_Roll_grad
and e_001_achassis_gs_full_sys_swpt_Understeer_grad (for understeer gradient), and
promote both candidates.
The responses move from the Candidates to the Inclusion list.
Note: Columns appear in the design matrix in the order that you promote factors for inclusion.
Tip: Place your mouse pointer over column headings to display key information about the
abbreviation shown.
Overview
This chapter guides you through reviewing the results of your analysis, fitting your data to a polynomial
to determine which factors most affect model performance, and publishing results to an HTML or SYLK
file.
The sections in this chapter are:
• Reviewing Results
• Fitting Results
• Publishing Results
Reviewing Results
After Adams/Chassis completes the trials defined in your design matrix, you return to the Adams/Insight
interface to view the results.
Fitting Results
Now that Adams/Chassis has completed the trials defined in your design matrix, you can use
Adams/Insight to fit your results to a polynomial or a response surface. The purpose of fitting your results
is to establish a relationship between the factors and responses that you selected for the design matrix.
Fitting results includes a multiple regression. You will be able to investigate the parts of the regression
in the Summary, located in the treeview under Analysis, after completing the following steps. For more
information on this topic, refer to the Adams/Insight online help.
Note: The material in the following sections includes statistical terms related to DOEs. For
explanations of these terms, refer to the Adams/Insight online help.
For definitions of the items in the results tables, refer to the online help.
The tables also provide you with a color code that indicates the soundness of your results:
• Green indicates that all fit criteria meet or exceed the highest fitting thresholds.
• Yellow indicates that the fit criterion may bear investigation.
• Red indicates that the fit criterion should be investigated.
Publishing Results
Adams/Insight lets you save your files as either HTML or SYLK files. Once saved, you can use either a
browser or spreadsheet program, such as Excel, to modify factors and see the effect on responses without
performing full simulations.
• Contributions - This check box appears if you specified a non-zero tolerance for any factor.
When present and selected, this check box displays the Tolerance Contributions table that
provides the percent contribution of each factor to the tolerance of each response.
• Stats - Displays R², adjusted R², P, and R/V statistics for each response.
• Effects - For each response, displays effects caused by varying each factor from its minimum to
maximum value.
• Nonscalar - Displays composite responses in addition to the scalar responses.
• Plots - Opens a new window that displays a plot for each response.
• Info - This button displays a separate window that provides summary information about the
DOE parameters for the current page. It also provides Web environment information that is
valuable if you need to contact Adams technical support.
For more information on the controls and information provided by the HTML page, refer to the
Adams/Insight online help.
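The Stats entries above can be sketched directly from the fit residuals; the actual/predicted values below are illustrative:

```python
def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SSE/SST."""
    mean = sum(actual) / len(actual)
    sse = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    sst = sum((a - mean) ** 2 for a in actual)
    return 1.0 - sse / sst

def adjusted_r_squared(actual, predicted, n_terms):
    """Penalize R^2 by the number of regression terms (incl. intercept),
    so adding useless terms cannot inflate the score."""
    n = len(actual)
    r2 = r_squared(actual, predicted)
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_terms)

actual    = [2.0, 4.0, 3.0, 5.0]
predicted = [2.1, 3.9, 3.2, 4.8]
print(r_squared(actual, predicted))
```

Values near 1 mean the polynomial reproduces the trials closely; a large gap between R² and adjusted R² suggests the model carries terms it does not need.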
Overview
This chapter introduces you to the suspension tutorial and gets you started. The tutorial uses a simple
automotive example to illustrate the basics of Adams/Insight. Even if you don’t have an interest in
automotive parts as a regular part of your job, we think you’ll find these instructions sufficient to help
you focus on the capabilities of Adams/Insight.
The sections in this chapter are:
• About the Tutorial
• Starting Adams/View
• Creating a Modeling Database
• Adams/Insight Interface
Toe angle is the projected angle the wheel plane makes with the ground when viewed from above the vehicle. Toe-in is considered positive, and toe-out is considered negative.
• Determining if exaggerated changes in toe angle result in aggressive tire wear.
• Assessing your model against a manufacturing variation.
• Assessing packaging requirements for your model.
Starting Adams/View
This section provides instructions on how to start Adams/View on UNIX and Windows.
Note: On Windows, you may need to set the permissions to Full Control to edit the tutorial files.
Note: You can skip this step if you previously used the Help → Copy Examples To feature to copy
all of the tutorials for Adams/Insight. Your working directory is ain_examples/aview.
2. Type the command to start the Adams Toolbar at the command prompt, and then press Enter.
3. Select the Adams/View tool .
The Adams/View main window appears.
Note: You can skip this step if you previously used the Help → Copy Examples To feature to copy
all of the tutorials for Adams/Insight. Your working directory is ain_examples/aview.
2. From the Start menu, point to Programs, point to MSC.Software, point to MD Adams 2010,
point to Aview, and then select Adams - View.
The Adams/View main window appears.
Running a Simulation
Before you create your experiment, you’ll simulate the suspension model in Adams/View.
• In the Simulation Control dialog box, select the Start tool , and wait for the simulation to
finish.
Starting Adams/Insight
In this section, you’ll open Adams/Insight from Adams/View and begin creating an experiment to
measure the performance of a suspension model.
Adams/Insight Interface
This section describes what you see when Adams/Insight first opens. Figure 20 shows the main window
as it appears when you first launch Adams/Insight. It includes the following items:
• Menu bar - Contains pull-down menus for File, Edit, Define, Simulation,
Tools, and Help.
• Toolbars - Contain commonly used tools for accessing files, creating and modifying designed
experiments.
• Treeview - Displays a hierarchical list of objects that you can include in an experiment. The
tree is especially useful for selecting and identifying objects when you are creating a design
matrix.
• Viewport - The area of the window that displays parameters for modifying the objects you
select from the treeview.
• Status bar - Displays messages and issues prompts during your Adams/Insight session.
Adams/Insight Toolbars
The Adams/Insight main window has four toolbars:
• Main (Experiments) toolbar - Lets you execute basic commands.
• Adams/Insight (Experiments Contents) toolbar - Helps you build and execute your experiment.
• Work Space toolbar - Lets you execute commands on the work space.
• Report toolbar - Lets you generate and export a report.
If you hold your mouse pointer over any tool, tip text appears giving a short description of the tool.
Tools in toolbars are arranged in the order that you’ll use them in the process of creating and executing
your designed experiment. Depending on where you are in the process of creating an experiment,
Adams/Insight enables or disables the tools (you can always show or hide them if you need to).
This feature alerts you to the correct order of procedures to follow. For example, the Run simulations tool
is disabled until you define required elements for a design matrix.
For more information on the toolbars, see the Adams/Insight online help.
Overview
This chapter guides you through the process of creating a design matrix and running the model through
a number of simulations that you set up in the experiment.
The sections in this chapter are:
• Creating a Design Matrix
• Running Your Experiment
Promoting Candidates
The first step in creating your designed experiment is to select the factors that you want to
include in your design matrix. You select factors from the Candidates list in the treeview, and then
promote them to the Inclusions list. Promoting candidates to inclusions causes them to become part of
your design matrix.
Note: The treeview displays the full object hierarchy for each design variable. This tutorial will
only refer to the variable name. For example, the variable hpl_tierod_outer.x appears as
ground.hpl_tierod_outer.x in the treeview.
3. Select the candidate, hpl_tierod_outer.x, and then move your cursor to the Adams/Insight
toolbar and select the Promote to inclusion tool .
The candidate hpl_tierod_outer.x moves to the Inclusion list under Factors in the treeview.
Tip: To select more than one factor, hold the Ctrl key as you click. To promote the factors
directly from the treeview, press the shortcut key F5.
• Type: Continuous
• Delta Type: Relative
• Settings: -5, 5
Promoting Responses
Now that you have finished promoting and modifying your factors, the next step is to promote your
responses for the experiment.
Tip: You can select the minus (-) sign in front of Factors to collapse that section of the treeview
and save screen space.
2. Continue expanding the levels under Candidates and tut_101_aview. Under tut_101_aview,
you’ll see a list of responses that are potential candidates you can include in your design matrix.
3. Select and promote the following responses just as you promoted the factors in step 3:
• toe_left_REQ
• toe_right_REQ
The responses move from the Candidates to the Inclusion list.
Modifying Responses
The modifications you’ll make to the responses are minor. You’ll add units and change one of the
parameters. To learn more about response parameters, press the F1 key from the Response form.
To modify responses:
1. In the treeview, under Responses, in the Inclusions list, select the response, toe_left_REQ.
Note: Output characteristics are grayed out when you use Adams/Insight with Adams/View and
other Adams applications. The output characteristic is set by the originating CAE
application, and is displayed in the Response form for information only.
3. Select Apply.
Adams/Insight saves your response modifications.
4. Select the second response, toe_right_REQ, and make the same modifications as in step 2, above.
Note: Selecting the Generate Work Space tool in the Adams/Insight toolbar performs
Steps 4 and 5.
The Work Space appears in the viewport as shown in Figure 25. This table displays the work space matrix
for the full-factorial experiment that you defined above. Adams/View will run a simulation for each trial
defined in this matrix. The column headings are sortable and sizeable. You can also select Work Space
Review to view summary information for each factor and response in your experiment.
In the treeview, at the Design level, the letters D:W appear to indicate that the Design contains a
successfully generated design work space.
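A full-factorial work space simply enumerates every combination of factor levels, which is why the trial count grows multiplicatively with each added factor. A sketch with illustrative level values (the factor names echo the tutorial, but these numbers are made up):

```python
from itertools import product

def full_factorial(factor_levels):
    """Every combination of every factor's levels; the trial count is
    the product of the level counts, so it grows quickly."""
    names = list(factor_levels)
    return [dict(zip(names, combo))
            for combo in product(*factor_levels.values())]

# Two factors at two levels each -> 2 * 2 = 4 trials:
trials = full_factorial({"hpl_tierod_outer.x": (412, 422),
                         "hpl_tierod_outer.z": (240, 250)})
print(len(trials))
```

Each returned dictionary corresponds to one row of the work space matrix, and each row is one Adams/View simulation.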
Note: Columns appear in the work space matrix in the order that you promote factors for
inclusion.
Tip: Put your mouse pointer over column headings to display key information about the
abbreviation shown.
Note: This procedure builds, runs, and postprocesses all of the simulations within the
Adams/View session. We recommend that you break up the process flow into its separate
phases using the MDI INSIGHT BUILD and MDI INSIGHT LOAD commands. This is
especially important when you have more than 30 trials.
Overview
This chapter guides you through reviewing the results of your analysis, fitting your data to a polynomial
to determine which factors most affect model performance, and publishing results to an HTML or SYLK
file.
The sections in this chapter are:
• Reviewing Results
• Fitting Results
• Optimizing Results
• Publishing Results
Reviewing Results
After Adams/View completes the trials defined in your design matrix, you return to the Adams/Insight
interface to view the results.
To return to Adams/Insight:
1. From the Main menu in Adams/View, select Simulate, point to Adams/Insight, and then select
Display.
The Adams/Insight Display dialog box appears with the name of your current experiment.
2. Select OK.
The Adams/View window is hidden and the Adams/Insight window opens.
Fitting Results
Now that Adams/View has completed the trials defined in your work space matrix, you can use
Adams/Insight to fit your results to a polynomial or a response surface. The purpose of fitting your results
is to establish a relationship between the factors and responses that you selected for the work space
matrix. Fitting results includes a multiple regression. You will be able to investigate the parts of the
regression in the Summary (located in the treeview under Analysis) after completing the following steps.
For more information on this topic, refer to the Adams/Insight online help.
For definitions of the items in the results tables, refer to the online help.
The tables also provide you with a color code that indicates the soundness of your results:
• Green indicates that all fit criteria meet or exceed the highest fitting thresholds.
• Yellow indicates that the fit criterion may bear investigation.
• Red indicates that the fit criterion should be investigated.
Optimizing Results
You can perform single-objective and multi-objective optimization using Adams/Insight. Single-
objective optimization involves trying to achieve a target for one scalar response; multi-objective
optimization involves more than one scalar response.
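Single-objective optimization over the fitted surrogate can be pictured as a search for the factor setting whose predicted response is closest to a target. The quadratic surrogate and the factor range below are made up for illustration; Adams/Insight's own optimizer is more sophisticated than this brute-force sweep:

```python
def optimize_single(surrogate, levels, target):
    """Brute-force search of a fitted surrogate (a plain function here)
    for the factor value whose predicted response is closest to target."""
    best = min(levels, key=lambda x: abs(surrogate(x) - target))
    return best, surrogate(best)

# Illustrative quadratic surrogate of one factor; target response 0.0.
surrogate = lambda x: 0.5 * (x - 418.0) ** 2 - 2.0
levels = [412 + 0.5 * i for i in range(21)]  # sweep 412.0 .. 422.0
best_x, best_y = optimize_single(surrogate, levels, 0.0)
print(best_x, best_y)
```

Because the surrogate is cheap to evaluate, many candidate settings can be tried without a single additional simulation; multi-objective optimization extends the same idea to a combined measure over several responses.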
You can optimize your results by:
Adams/Insight updates the responses to reflect the changes you made to the factors. Use the Reset
button to return to the nominal values for each factor. Use Reload to reload all of the optimization
settings.
Note: To save your results, select Write and enter the name of the file to which you want to save.
You can save to a number of different formats, including a .cmd file, which can then be read
back into Adams/View to set the model using the specified factor settings.
2. Select Fixed next to any factor that you don’t want changed during the optimization.
3. Press Run.
Adams/Insight updates the factor values to reflect the changes you made to the responses. Use the
Reload button to return to the nominal values for each factor/response.
Note: To save your results to a text file, select Write and enter the name of the file and file type
to which you want to save.
Publishing Results
Adams/Insight lets you save your results in .html, .slk, .bas (Visual Basic), and .m (MATLAB)
formats. Once saved, you can use other utilities, such as a browser or spreadsheet program, to modify
factors and see the effect on responses without performing full simulations.
3. Enter a name for your file and specify the path where you would like it to reside, and then select
Save.
Adams/Insight saves your file in the directory that you specified.
4. Continue with the next section, Modifying Values Using a Web Browser, to learn how to view and
use the results in the HTML file.
3. Change the value for the first factor hpl_tierod_outer.x from 417 to 420, and then select Update.
The estimated responses adjust to reflect the new factor values. Notice that the value for only one
of the responses, toe_left_REQ, reflects a change. Because the Adams model you’re working
with is an independent suspension, in which the right tie rod is not coupled with the left tie rod,
the changes in the factor values you made only affect the left side of the suspension.
4. You can continue to vary the factor values and investigate how changes to them affect your
responses. To learn more about analyzing the results of your experiment and publishing your
results to HTML or SYLK pages, refer to the Adams/Insight online help.
5. Close your browser window.
6. Exit Adams/Insight.
7. Exit Adams/View.
Overview
This chapter introduces you to the Monte Carlo method of analysis. The tutorial uses a launch
vehicle/spacecraft separation example to illustrate the mechanics of the solution.
The sections in this chapter are:
• About the Tutorial
• Starting Adams/View
• Creating a Modeling Database
• Running the Simulation
• Starting Adams/Insight
• Creating a Design Matrix
• Running the Experiment
• Reviewing Results
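The Monte Carlo method this chapter introduces draws random factor settings from assumed distributions and watches how the responses scatter. A minimal sketch, with a linear stand-in for the simulation (the nominal value, spread, and response function are all assumptions for illustration):

```python
import random

def monte_carlo_trials(nominal, spread, n, seed=0):
    """Draw n factor settings from a normal distribution centred on the
    nominal value (spread = one standard deviation)."""
    rng = random.Random(seed)
    return [rng.gauss(nominal, spread) for _ in range(n)]

# Stand-in for a simulation: a response that varies with one factor.
response = lambda force: 0.02 * force + 1.0

settings = monte_carlo_trials(nominal=100.0, spread=5.0, n=200)
values = [response(x) for x in settings]
print(min(values), max(values))
```

The spread of the resulting response values is the quantity of interest: it shows how much variation in the factors propagates into variation in the behavior of the mechanism.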
Starting Adams/View
This section teaches you how to start Adams/View on UNIX and Windows.
Note: You can skip this step if you previously used the Help → Copy Examples To feature to copy
all of the tutorials for Adams/Insight. Your working directory is ain_examples/aview.
2. Type the command to start the Adams Toolbar at the command prompt, and then press Enter.
3. Select the Adams/View tool .
The Adams/View main window appears.
Note: You can skip this step if you previously used the Help → Copy Examples To feature to copy
all of the tutorials for Adams/Insight. Your working directory is ain_examples/aview.
2. From the Start menu, point to Programs, point to MSC.Software, point to MD Adams 2010,
point to Aview, and then select Adams - View.
The Adams/View main window appears.
6. Zoom in on the top portion of the vehicle. Note the four forces between the adapter frustum and
the spacecraft (they’re circled in the following figure).
Identifying Measures
Here, you will identify the measures in the model.
Starting Adams/Insight
In this section, you’ll open Adams/Insight from Adams/View and begin creating an experiment to
measure the performance of a launch vehicle model.
3. Select OK.
In the treeview of Adams/Insight, note that the model has eight factors and four responses.
Promoting Candidates
The first step in creating your designed experiment is to select the factors that you want to
include in your design matrix. You select factors from the Candidates list in the treeview, and then
promote them to the Inclusions list. Promoting candidates to inclusions causes them to become part of
your design matrix.
Tip: To promote the factors directly from the treeview, press the shortcut key F5.
Tip: You can select the minus (-) sign in front of Factors to collapse that section of the treeview
and save screen space.
6. Continue expanding the levels under Candidates and separation. Under separation, you’ll see
a list of responses that are potential candidates you can include in your design matrix.
7. Select and promote all of the responses just as you promoted the factors in step 3.
The responses move from the Candidates to the Inclusion list as shown in Figure 32.
Modifying Factors
After you promote your factors, you define parameters for them in the Factor form. To learn more about
factor parameters, press the F1 key from the Factor form.
3. Select Apply.
4. To create the work space, select the Generate Work Space tool .
The Work Space appears in the viewport. Note that the response columns are empty.
5. From the treeview, under Design, select Work Space Review.
Note: This procedure builds, runs, and postprocesses all of the simulations within the
Adams/View session. We recommend that you break up the process flow into its separate
phases using the MDI INSIGHT BUILD and MDI INSIGHT LOAD commands. This is
especially important when you have more than 30 trials.
Reviewing Results
After Adams/View completes the trials defined in your design matrix, you return to the Adams/Insight
interface to view the results.
To return to Adams/Insight:
1. From the Main menu in Adams/View, point to Simulate, point to Adams/Insight, and then select
Display.
The Adams/Insight Display dialog box appears.
2. Verify the name of your current experiment, and then select OK.
The Adams/Insight window replaces the Adams/View window.
For most experiments, you can select both cache options. Adams/Insight
stores intermediate results during the refinement, which in most cases
greatly speeds up the refinement process. The cached data consumes
computer memory, however, so for some very large models you may need to
turn one or both options off. If you must turn one option off, turn off
Term caching, because it uses the most memory.
Monitor Messages - Select the type of messages you want to view during the refinement process.
Automatically Remove Outliers - Allows you to remove individual trials from the fit. Learn about Refinement.
To select multiple trials, hold down the Ctrl key while selecting.
Select All - Select to highlight (and remove) all regressions listed.
Clear All - Select to unselect (clear) all selected regressions.
Filter Using - Select the criteria used to search for trials.
• Actual.
• Estimate.
• Raw Residual. See Residuals.
• Studentized. See Studentized Residuals.
• Cook's. See Cook’s Statistics.
Exclude Runs - Select a filter type for excluding trials. Enter the limits below.
Upper Limit - Enter the upper limit for the filter.
Lower Limit - Enter the lower limit for the filter.
Apply each exclusion to - Select one of the following:
• Only the regression with the outlier: The filtered trials will
only be removed from the corresponding regressions.
• All regressions: The filtered trials will be removed from all
regressions.
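The Exclude Runs filter described above can be pictured as a residual test: compute each trial's raw residual (actual minus estimate) and flag trials outside the chosen limits. A sketch with illustrative data:

```python
def exclude_runs(actual, predicted, lower, upper):
    """Return indices of trials whose raw residual (actual - predicted)
    falls outside [lower, upper] -- candidates for exclusion."""
    return [i for i, (a, p) in enumerate(zip(actual, predicted))
            if not (lower <= a - p <= upper)]

actual    = [2.0, 4.1, 3.0, 9.0]
predicted = [2.1, 4.0, 3.1, 5.0]
print(exclude_runs(actual, predicted, -0.5, 0.5))
```

Studentized residuals and Cook's statistics refine the same idea by scaling each residual by its estimated variance or by its influence on the fit.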
Argument - Description
-bg R G B - Window/background color, 0 <= RGB <= 255
-deskcolor - Use desktop color for window/background
-diag [item] - Runs one of the specified application installation diagnostics. Use -diag -help to display available options.
-e file - Specifies experiment file
-gtdiff F1 F2 - Graphically displays the difference between two text files (primarily used for Adams/Insight ASC)
-reuse file - Reuse experiment settings from this experiment file
-experimental - Turn on experimental features
-h - Display this text and exit
-splash icon - Splash image file (.bmp, .png, .xpm)
-subprocess - Being called from another application
-v - Turn on verbose messages
-wide - Start Adams/Insight in a wider window
file - Specifies experiment file
Design Inclusion
Adams/Insight enables you to import a full or partial design matrix whose factor settings will be included
when the complete workspace is generated. Referencing an inclusion matrix is only applicable for D-
Optimal design types.
Design Space
This table displays the Design Space (factors) for each Trial in your experiment. The column headings
are sortable and resizable. You can also select Work Space Review to view summary information for each
factor and response in your experiment. See Design Work Space Review for more information.
Columns appear in the design space matrix in the order that you promoted factors for inclusion.
Put your mouse pointer over column headings to display key information about the abbreviation shown.
Design Specification
Defines the design of your experiment.
• Study - Perimeter
• Study - Sweep
• DOE Screening (2 Level)
• DOE Response Surface
• Variation - Monte Carlo
• Variation - Latin Hypercube
Learn more about Investigation Strategies.
Model Select one of the following:
• Linear
• Interactions
• Quadratic
• Cubic
• None. See None Option.
DOE Design Type Select one of the following:
• Plackett-Burman
• Fractional Factorial
• Full Factorial
• Box Behnken
• CCF (Central Composite Faced)
• D-Optimal
• Latin Hypercube
Learn more about DOE Design Types.
• All - Uses all of the candidate runs that are in a full factorial design
for a given collection of factors (potentially a very large number).
• Random - Limits the number of Candidates, thus reducing the run
time of the D-Optimal algorithm. If you choose Random, enter a
value for Number of Candidate Runs.
Number of Runs Indicates the number of unique Trials (rows) in the Design Space and
Work Space.
• Standard - You can use this option if you are running an analytical
Design of Experiment (DOE), and do not expect the order of the runs
to have a significant effect on the results.
• Random - This is generally the run order to use for physical DOEs.
For example, if your response varies depending on when you
measure it during the course of a day, you should randomize the run
order in order to capture the overall behavior of the system.
• Ease of Adjustment - This option is also more applicable for
physical DOEs. It affects the Work Space and Design Space matrix
when you set a Factor attribute to Ease of Adjustment.
Design Work Space
Factor Form
(Treeview) -> Factors -> Inclusions/Candidates -> Factor name
You can also use this text box to specify values for your user-defined
distribution.
Distribution Profile Displays a graphical representation of the selected distribution.
Variation Details Select to display the details of the distribution, including upper/lower limits
and mean value.
Tie Tab
Tie Type Enter one of the following:
• Scale
• Offset
After choosing the tie type, select Apply to make it effective. Learn more
about Tying Factors.
Tied Factors table Depending on the value entered in Tie Type, specify the Scale or Offset for
each factor. See Examples.
Displays a summary of all of the factors included in your current experiment. Learn more about Factors.
Factors Table (All)
Displays a summary of all of the factors in your experiment. Learn more about Factors.
File Export Matrices
Enables you to select various formats to which to export the regression model.
You can select the following:
• .htm (interactive .htm file utilizing JavaScript)
• .slk (spreadsheet-neutral file that can be read into spreadsheet programs such as Excel)
• .bas (Visual Basic set of functions, which can be used in Excel)
File Open
File -> Open
File Save As
File -> Save As
Find Directory
File -> Select Directory
To select a directory:
1. Enter the name of the directory you want to use.
2. Select OK.
Import Experiment Factors/Responses
Model Properties
This form allows you to annotate and investigate aspects of the current regression models. Learn about
Refinement of a Fit.
Note: These annotations are stored in the experiment file, but are not
exported to the Web page or the SYLK files.
Regression Select one of the following:
Note: The items displayed in this form can help you in reviewing the quality of fit. There is no
default option to print the tables or charts. If you want to make hardcopies of these values,
use a screen capture tool or cut and paste to get the contents of the tables.
Optimization Preferences
Tools -> Optimize Model -> Preferences
Optimization Write
Tools -> Optimize Model -> Write
Defines the file to which to save your optimization results. Learn more about Optimization.
Optimize Model or Experiment
Adams/Insight allows you to optimize your fitted models. With some simulation conduits, Adams/Insight
also allows you to directly optimize your experiment using simulations. During optimization,
Adams/Insight automatically adjusts the factor values so that the resulting responses come as close as
possible to the specified target values.
You may treat each response as an objective or as a constraint. A response is an objective if you attempt
to maximize or minimize that value. A response is a constraint if you want to keep it fixed to a desired
value or within a range of desired values.
Adams/Insight performs both single-objective and multi-objective optimization. Adams/Insight
computes a cost for each objective based on the objective option, target value, and weighting factor.
Adams/Insight combines the individual costs into one overall cost based on your choice of the multi-
objective method. Adams/Insight then attempts to minimize the overall cost.
Learn more about Optimization.
Note: Moving the slider will dynamically update the response values if
you are optimizing a fitted model and the regression model has
less than 50 responses. If you are optimizing an experiment or the
regression model has more than 50 responses, position the
respective sliders to the desired factor setting and select the
Update button.
(Factor) Maximum Modify the maximum value for this factor. You can reduce the maximum
value to reduce the range of possible factor values. You cannot increase the
maximum beyond its initial value.
(Factor) Value Modify the current value for this factor, which will be the initial value for the
next optimization, or you can use the Update button to directly compute the
responses using this value.
Fixed Check this box if you do not want a specific factor changed during the
optimization.
Note: The arrow in the slider area identifies the current value.
(Response) Maximum Displays the maximum value for this response. This is the ideal maximum,
ignoring other responses.
(Response) Value Displays the last computed value for this response. Adams/Insight computes
new response values when you press the Run or Update button.
Preferences
Edit -> Preferences
If you use the same nonzero random number seed each time, Adams/Insight
will generate the same series of random numbers. For example, this means
that you can recreate the Work Space for a Variation experiment (as long as
you have the same factors and number of Trials).
If you set the seed to zero, Adams/Insight will use a seed based on the
current time for each sequence. This ensures that Adams/Insight will
always generate a different sequence, but you will not be able to reproduce
the sequence.
The random number seed is also used for randomizing the design order and
selecting random trials for D-Optimal designs, if you select those options.
Fit
Significance (CI) Enter a value used to compute the coefficient Confidence Interval in the
Terms of Regression dialog box. The default is 0.05, which corresponds to
a 95% confidence interval. The pop-up help on the '+/-' column header in
the Terms of Regression dialog box shows the % confidence.
For example, if you enter .95 as the first value and .8 as the second value,
the following happens:
Allows you to specify the model order for your experiment. Learn about Refinement.
• Linear
• Interactions
• Quadratic
• Cubic
Refinement - Remove Outliers
Allows you to select Outliers to remove from your experiment. Learn about Refinement.
Allows you to select terms to remove from your experiment. Learn about Refinement.
Before fitting the regression model, you can transform factors. The most common type of transformation
is an orthogonal transformation. The transformation is based on the range of values for the factor, such
that the transformed value ranges from -1 to 1. In some instances, a transformation may also be applied
to the response.
For example, if a factor, F1, has values ranging from 80 to 120, the transformed factor, say, TF1, has a
range of -1 to 1.
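As a sketch, the scaling just described can be written in a few lines of Python (a generic illustration of orthogonal coding, not Adams/Insight's internal implementation):

```python
def to_coded(value, low, high):
    """Map a natural factor value in [low, high] onto the coded range [-1, 1]."""
    mid = (low + high) / 2.0
    half_range = (high - low) / 2.0
    return (value - mid) / half_range

# For the factor F1 above (range 80 to 120):
# to_coded(80, 80, 120)  -> -1.0
# to_coded(100, 80, 120) ->  0.0
# to_coded(120, 80, 120) ->  1.0
```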
Learn about Refinement.
• None
• Square
• Square root
• Inverse
• Inverse of square root
• Log base 10
Response
Defines parameters for your responses. Learn more about Responses.
• Scalar
• Composite
For more information, see Response Types.
Description Tab
Description Add a description to a response. The description appears in your results when
you export them to a SYLK or HTML report.
URL (Universal Add a URL (Web address) to link to a response. If you publish your experiment
Resource Locator) results to an HTML page, you can use this option to link the response to that page.
A fully qualified URL (identifying protocol, server, and specification) is
recommended, for example http://support.adams.com/kb/csearch.asp. An
easy way to get this string is to copy a valid URL from your browser's
address bar.
Variable When using Adams/Insight in stand-alone mode, this text box is optional.
When using Adams/Insight in conjunction with an Adams modeling
application, this text box contains the name of the design objective, or refers
to the Fetch class that retrieves the results values.
Displays a summary of all of the responses not included in your current experiment. Learn more about
Responses.
Responses Inclusions Table (All)
Displays a summary of all of the responses in your experiment. Learn more about Responses.
Responses Table (All)
Displays a summary of all of the responses in your experiment. Learn more about Responses.
Reuse
File -> Reuse -> Factors, Responses, Specifications, All
Enables you to reference an existing experiment and include selected factors, responses, specifications,
or all of its elements in the current experiment. More on Reusing Components.
Simulation Properties
(Treeview) -> Simulation -> (Product name)
Solver Settings
Tools -> Optimize Model -> Preferences -> Solver Settings
Defines settings for the optimization solver. The values in this dialog box vary based on the Optimization
Solver setting in the Optimization Preferences dialog box.
Treeview
Displays a hierarchical list of objects that you can include in an experiment. The tree is especially useful
in selecting and identifying objects when you are creating a design matrix. Learn About the Toolbars.
Learn what the Plus and Minus signs mean.
Work Space Column Calculator
The Workspace column calculator enables you to perform mathematical operations on the columns of an
existing workspace. Learn more about the Work Space Column Calculator.
Note: Use the factor abbreviation in the expression, not the full factor name.
The standard math and random Python module functions are available
in an expression. See References for more information on Python
syntax.
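For illustration only (the factor abbreviations f_01 and f_02 below are invented; substitute the abbreviations from your own experiment), a column expression using the Python math module might look like:

```python
from math import sqrt

# Hypothetical workspace columns for two factors.
f_01, f_02 = 3.0, 4.0

# New column: root-sum-square of the two factor columns.
new_column = sqrt(f_01**2 + f_02**2)
# new_column -> 5.0
```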
Work Space Correlation
Allows you to measure the potential strength of a relationship, or lack of relationship, between two
variables. Learn more about Work Space Correlations.
• Pearson
• Spearman rank
Display Select one of the following:
• All - Displays all of the factors and responses along each axis of the
grid. Note that the diagonal of the grid will always contain a value of
one.
• Factors vs. Responses Only - Displays the inputs (factors) along the
rows and the outputs (responses) along the columns.
Correlation Details Select a cell in the table and then select Correlation Details to display additional
information on the factor/response combination that was selected.
Highlight values Enter a value above which correlations will be highlighted. For example, entering
larger than .8 highlights all correlations with a value of .8 or above.
Green icons are displayed in all cells of the grid where the correlation is above
this value.
Note: If you set this value to 1.0, the icons are not displayed.
ANOVA
ANOVA stands for Analysis of Variance, a statistical method for breaking down the total
variability in a dataset into components attributable to various sources.
Appendix
Adams/Insight Toolbars
The tools in the toolbars are arranged in the order that you use them in the process of creating and
executing your designed experiment. Depending on where you are in the process of creating an
experiment, Adams/Insight enables or disables the tools. This feature alerts you to the correct order of
procedures to follow. For example, the Run simulations tool is disabled until you define required
elements for a design matrix.
Adjusted R-Squared
The R-Squared value always increases when a term is added to the model, irrespective of whether or not
the new model is better than the previous one. With every additional term in the model, the residual
Degrees of Freedom (DOF) are reduced by one. Therefore, unless the error Sum of Squares (SS) of the
new model is reduced by an amount greater than the previous error Mean Squares (MS), the new model
will have a larger error mean square and is therefore not a better model. To overcome this deficiency, the
adjusted R-squared value is based on mean squares, which account for degrees of freedom, rather than
on sums of squares.
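The standard formula, not spelled out in this glossary (n is the number of runs and p the number of model terms, including the constant), is:

```latex
R^2_{\mathrm{adj}} = 1 - \left(1 - R^2\right)\frac{n - 1}{n - p}
```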
Values: The adjusted R-squared value is typically smaller than the R-squared value for a given
regression model. When the R-squared value is very small, the adjusted R-squared can be a negative
number.
Troubleshooting: If the R-squared value is fairly high but the adjusted R-squared value is low, it
indicates that some of the terms in the model are not very useful. It also indicates that all the variability
in the response data has not been explained by the model. You should examine the terms and make a
decision about dropping some terms and adding some new ones. The decision about which terms to drop
can be made by looking for terms with low T-values.
Beta (standardized coefficient)
Note: If the data is both centered and scaled, the constant term is zero in standardized
coefficients.
Candidates
Candidates are the potential members in a group, as compared to the Inclusions. In Adams/Insight,
candidate factors or candidate responses are the possible factors or responses you have to choose from for
the experiment. You promote a candidate factor or response to the Inclusions list.
In the graphical user interface (GUI), you can select a candidate factor or response with the mouse, then
select the Promote button on the toolbar (or press the F5 key). You can select multiple candidates to
promote by using the Ctrl key with the left mouse button. To demote a factor or a response, use the
Demote button or the F6 key.
Confidence Interval
The confidence interval, or confidence limits, measure uncertainty in an estimate. The confidence limits
about the estimate contain the true value, with a specified level of confidence. For example, 95%
confidence limits around the estimate of a regression coefficient mean that there is a 95% chance that the
true value of the regression coefficient is within those limits.
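For a regression coefficient, the limits take the standard textbook form (not specific to Adams/Insight), where the estimated coefficient is adjusted by a Student-t quantile at significance level alpha with the residual degrees of freedom, times its Standard Error:

```latex
\hat{\beta} \;\pm\; t_{1-\alpha/2,\,\mathrm{DOF}} \cdot \mathrm{SE}(\hat{\beta})
```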
Cook’s Statistics
Cook's statistics measure the influence of each run on the fit. Values are usually between zero and 1, but
can be larger. A larger value indicates that a run has more effect on the final fit than a run with a smaller
value. Values larger than most of the others, for example larger than .5 or especially larger than 1, suggest
that the run is unusually influential. The run might be an outlier, or at least should be examined to make
sure it is accurate.
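The textbook definition (consistent with the references below; r_i is the ith internally Studentized Residual, p the number of model terms, and h_ii the ith diagonal entry of the hat matrix) is:

```latex
D_i = \frac{r_i^2}{p} \cdot \frac{h_{ii}}{1 - h_{ii}}
```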
References
• DS - Applied Regression Analysis, Draper and Smith (pg 210)
• MM - Response Surface Methodology - Process and Product Optimization Using Designed
Experiments, Myers and Montgomery (pg 49)
Degrees of Freedom (DOF)
Design Variable
A design variable is considered an input or factor for a Design of Experiment (DOE). When running a
DOE in Adams/View, the only input or factor you can use is an Adams/View design variable. If you are
using Adams/Insight in conjunction with Adams/View, three object types can be used as inputs to the
DOE: the x, y, and z values of a point; certain attributes of a user-defined element; and traditional
Adams/View DOE design variables.
Design of Experiment (DOE)
F-Ratio
The F-ratio is used in the regression ANOVA to test the Significance of the regression. The F-ratio is
computed as the ratio of the mean square variation due to the regression model (MSM or MSR) to the
mean square variation due to error (MSE). The F-ratio is compared to an F-distribution to test the
hypothesis that all coefficients are zero. High values for the F-ratio will lead to the rejection of this
hypothesis and, therefore, suggest that the regression is significant and the model is useful.
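In symbols (standard ANOVA notation; n is the number of runs and p the number of model terms, including the constant):

```latex
F = \frac{MS_R}{MS_E} = \frac{SS_R/(p-1)}{SS_E/(n-p)}
```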
Fit Table
The fit table displays these quantities for the fit:
• Degrees of Freedom (DOF)
• Sum of Squares (SS)
• Mean Squares (MS)
• F-Ratio
• Regression Significance (P)
Goodness of Fit
The goodness-of-fit summary displays some of the primary statistics to view when assessing the
goodness of fit. This includes R-Squared, Adjusted R-Squared, Regression Significance (P), and Range-
to-variance ratio.
HTML Web Page Example
Inclusions
Inclusions are the actual members in a group to be investigated as compared to the Candidates. In
Adams/Insight, inclusion factors or Inclusion responses are the actual factors or responses you will be
using in the experiment. You promote a candidate factor or response to the Inclusions list or you can
demote an inclusion member to the Candidates list.
In the graphical user interface (GUI) you can select an inclusion factor or response with the mouse, then
select the Demote button on the toolbar (or press the F6 key). You can select multiple inclusions to
demote by using the Ctrl key with the left mouse button.
Interaction
When the effect that a factor has on a response depends on the value of a different factor, the two factors
have an interaction. Interaction effects are captured through special terms in the model that consist of
products of factors. Interaction effects are important because in the presence of strong interaction, the
main effects of the factors may be misleading. Learn more about Main Effects.
Least Squares Method
Level
Levels are the possible values that a factor can take in an experiment.
Values: In most screening designs the factors have 2 levels. In RSM designs they typically have 3 levels.
It is also possible to have some factors at 2 levels and others at 3 or more levels.
Main Effects
Main effect refers to the primary effect of a factor. A good way to examine the main effects is through a
Pareto chart.
The Adams/Insight .htm file computes main effects on the fly using JavaScript.
The displayed main effect of a factor is the difference between the response at the factor's maximum value
and the response at the factor's minimum value, while all other factors are at their average values. Effects
may be positive (response increases with larger factor value) or negative (response decreases with larger
factor value).
Note that the minimum and maximum factor values do not necessarily produce the minimum and
maximum response values. If a response is highly nonlinear over the factor value range, the minimum
and/or maximum response values may be in the middle of the curve. In this case, the main effects values
are meaningless.
The effect % is the ratio of the effect value to the response value with all factors at their average values.
An effect % greater than 100% means that the variation in the response value is larger than the average
response value.
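The computation described above can be sketched as follows (a generic illustration; `predict` stands for any fitted-model prediction function and is a hypothetical name):

```python
def main_effect(predict, lo, hi, others_avg):
    """Main effect of one factor: response at its maximum minus response at
    its minimum, with all other factors held at their average values."""
    return predict(hi, *others_avg) - predict(lo, *others_avg)

# With a simple linear model R = 1 + 2*F1 + 3*F2, the main effect of F1
# over the coded range [-1, 1] is (1 + 2) - (1 - 2) = 4.
model = lambda f1, f2: 1 + 2 * f1 + 3 * f2
effect = main_effect(model, -1, 1, (0,))
```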
The effects are sorted from largest to smallest absolute value. The longest bar is always the same length;
the other bars are scaled relative to it based on their effect values. Positive effects have a dark blue bar,
and negative effects have a light blue bar.
Mean Squares (MS)
Model
In performing a regression analysis, the objective is to fit an equation (referred to as the model) to the
data such that the error between the values predicted by the equation and the actual observed values is
minimized. The model can have a constant Term, linear terms, and nonlinear terms.
• Linear model: R = a1 + a2*F1 + a3*F2 + e
• Interaction model: R = a1 + a2*F1 + a3*F2 + a4*F1*F2 + e
• Quadratic model: R = a1 + a2*F1 + a3*F2 + a4*F1*F2 + a5*F1^2 + a6*F2^2 + e
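As an illustration of fitting such a model by least squares (a generic NumPy sketch, not Adams/Insight's internal solver), fitting the interaction model to a small set of runs recovers its coefficients:

```python
import numpy as np

# Five hypothetical runs: a 2-level full factorial on F1, F2 plus a center point.
F1 = np.array([-1.0, -1.0, 1.0, 1.0, 0.0])
F2 = np.array([-1.0, 1.0, -1.0, 1.0, 0.0])
# Responses generated from R = 1 + 2*F1 + 3*F2 + 0.5*F1*F2 (no error term).
R = np.array([-3.5, 1.5, -0.5, 6.5, 1.0])

# Design matrix for the interaction model: columns for a1, a2, a3, a4.
X = np.column_stack([np.ones_like(F1), F1, F2, F1 * F2])
coeffs, *_ = np.linalg.lstsq(X, R, rcond=None)
# coeffs is approximately [1.0, 2.0, 3.0, 0.5]
```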
Multi-Objective Optimization
If you have more than one response set to Min or Max, then you are performing a multi-objective
optimization.
You use the target and weight values to adjust the relative importance of the responses. Based on these,
Adams/Insight computes a cost for each response and combines these into an overall cost. Adams/Insight
then minimizes the overall cost. The response cost is the difference between the response value and the
target value multiplied by the weight. The target value acts as the desired value, and the weight scales the
response relative to other responses. The weight should reflect both different units or scales among the
responses, as well as increased or decreased importance.
You can optimize either Total Cost or Worst Cost. You can select Total Cost or Worst Cost using the
Settings button. If you select Total Cost, Adams/Insight minimizes the sum of the response costs. If you
select Worst Cost, Adams/Insight minimizes the worst (largest) cost among the responses.
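The cost scheme described above can be sketched as follows (the exact cost function Adams/Insight uses internally is not documented here, so the weighted-distance form below is an assumption):

```python
def response_cost(value, target, weight):
    """Cost of one response: weighted distance from its target value
    (an assumed form of the scheme described above)."""
    return weight * abs(value - target)

def overall_cost(responses, method="total"):
    """Combine individual costs. `responses` is a list of
    (value, target, weight) tuples; `method` is 'total' or 'worst'."""
    costs = [response_cost(v, t, w) for v, t, w in responses]
    return sum(costs) if method == "total" else max(costs)

# Two responses: one off-target by 1 (weight 1), one off by 3 (weight 2).
# Total cost = 1 + 6 = 7; Worst cost = 6.
```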
None Option
Under Model, the none option is automatically selected when it is inappropriate to attempt to fit results
based on the investigation strategy selected. For example, a Perimeter study will automatically force a
None option. Certain variations of the sweep study will also force a none selection.
Outlier
An outlier is a data point that does not seem to fit with the others, and perhaps should be fixed or removed
from the fit. A simple case is data that nicely follows a straight line, except for one point in the middle
that lies far off the line. Often, this is the result of an anomaly or unexpected situation in the case that
generated the data. In Adams, this might be a combination of variable values that leads to completely
different model behavior, such as a part missing a stop or a linkage locking up. You can find outliers by
examining Residuals and Cook’s Statistics for each run.
Troubleshooting: When running analytical Design of Experiment (DOE)s, make sure that all the Trials
ran successfully. Often disk space limitations or a license server dropping off line for a few seconds can
cause an entire block of runs to be missing from the results.
Pareto
A Pareto diagram ranks items from most significant to least significant and displays the results as a bar
graph. Adams/Insight optionally displays the Main Effects of the fit in the published Web page.
Plus and Minus sign
Properties
The Regression Summary Properties window allows you to alter the name of the current model or add
comments. This can be helpful if you're assessing different models from the same experiment.
R-Squared
An R-squared value is the proportion of total variability in the data which is explained by the regression
model. It is computed as the regression or model Sum of Squares (SS) divided by the total sum of
squares.
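In symbols (standard notation, consistent with the definition above):

```latex
R^2 = \frac{SS_{\mathrm{model}}}{SS_{\mathrm{total}}} = 1 - \frac{SS_{\mathrm{error}}}{SS_{\mathrm{total}}}
```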
Values: Range is 0 to 1.
Troubleshooting: An R-squared of 1 indicates a perfect model. This is unlikely and may be due to the
number of terms being the same as the number of data points. Check the number of Error Degrees of
Freedom in the fit summary for the regression response. Generally, the more Error Degrees of Freedom
a model has, the better you can quantify the fit. You should add a few extra runs and then refit the model.
An R-squared of 0 indicates that the data is purely random or that the model is totally inappropriate. You
should check the range of response values to make sure that they make physical sense. Ideally, you should
obtain R-squared values greater than 0.9 for high confidence in the results.
RMS Error
RMS error or Root Mean Square error is an estimate of the unexplained variability remaining after a
model has been used to fit the data. If the model is good, RMS error should be small compared with the
mean value of the response.
Values: Theoretically, the smallest value of RMS Error is zero. However, this implies a perfect fit which
is unlikely and should, therefore, be suspect. In general, values which are two orders of magnitude
smaller than the mean value of the response are good.
Troubleshooting: If the R-Squared values are very good and the RMS Error is large, it indicates that the
model is reasonably good but there is a lot of variability in the data. For physical experiments, it may be
useful to check pure repeatability. If both R-squared values and RMS Error values are poor, it is advisable
to check the validity of the model and the data.
Range-to-variance ratio
The range-to-variance ratio measures how well a fitted regression model might predict new values. It is
defined as the range of estimated values at the original data points, divided by the average variance of the
estimated values. It measures the variation of predicted values due to the model, versus the variation due
to uncertainty in the model. A high value, greater than 10 for example, indicates that the prediction is
likely worthwhile. A low value, less than 4 for example, suggests that the uncertainty in the model is high
enough that predicted changes may not be significant.
Regression Significance
Regression significance is defined as the probability that the regression coefficients are all zero. In other
words, the regression model has no useful terms.
A low value of Significance, .02 for example, means that it is likely that at least one term in the model is
related to the response. A high value of significance, .3 for example, means that there is a high probability
that none of the terms in the model is related to the response. A low regression significance only means
that at least one term is likely significant, not necessarily all terms. Look at the Term Significance values
to check individual terms.
The regression significance is computed from the regression F-Ratio.
Residuals
A residual is the difference between the observed (actual) and predicted (estimated) values of the response.
Troubleshooting: Residual plots are useful to examine when troubleshooting your model. Examples of
such plots are a plot of the residuals versus the run number, or residuals versus a response or factor value.
Any trends that are observed indicate an effect that has not been properly captured in the experiment.
Residual plots can be based on either raw values or values that are Studentized. If a few runs have very
high residuals, you should examine the runs to check the validity of the data.
Residuals Table
The residuals table displays these quantities for each trial:
• Actual response
• Estimated response
• Raw residual. See Residuals.
• Studentized residual. See Studentized Residuals.
• Cook's statistic. See Cook’s Statistics.
Rules Summary
Adams/Insight uses a number of rules-of-thumb to help you evaluate regression results. The Rule-of-
thumb summary table summarizes the results of these rules. For each response in the experiment, the
summary displays the worst case among the Goodness of Fit rules, the Term Significance, Studentized
Residuals, and Cook’s Statistics.
Default Thresholds:
Quantity          First threshold   Second threshold
R2                0.95              0.8
R2adj             0.90              0.7
reg. P            0.01              0.05
Range/Variance    10                4
term P            0.01              0.05
abs(cooks)        0.5               1.0
abs(studentized)  3.0               4.0
The colored icons (green; yellow with a question mark; red with a cross) identify between which
thresholds the particular measure falls.
More information on threshold Preferences.
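The three-zone icon logic can be sketched generically (an assumption about the rule structure, inferred from the two thresholds per quantity; whether "better" means larger, as for R2, or smaller, as for P and Cook's, is inferred from the ordering of the thresholds):

```python
def rate(value, good, poor):
    """Classify a fit statistic against its two rule-of-thumb thresholds.
    Returns 'green' if it passes the stricter threshold, 'red' if it fails
    the looser one, and 'yellow' in between."""
    better_is_larger = good > poor
    passes_good = value >= good if better_is_larger else value <= good
    fails_poor = value <= poor if better_is_larger else value >= poor
    if passes_good:
        return "green"
    if fails_poor:
        return "red"
    return "yellow"

# rate(0.97, 0.95, 0.8) -> "green"   (R2 above 0.95)
# rate(0.03, 0.01, 0.05) -> "yellow" (reg. P between 0.01 and 0.05)
```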
Significance
Significance is the probability that a value at least as extreme as the value of the statistic being tested
could occur by random chance. Adams/Insight reports two types of significance values: one for the
regression as a whole (Regression Significance), and a value for each term (Term Significance).
Values: Since it is a probability, significance values range from 0 to 1. Low values (0.1 and smaller)
signify important terms and useful regressions.
Single-Objective Optimization
If you have only one response, or if you have only one response set to achieve an objective, then you are
performing a single-objective optimization. In this case, Adams/Insight will adjust the factors to try to
meet the objective of the single response. The weight will not affect the resulting factor values.
Sorting
For proper sorting to occur, you must set the Run Order option to Ease of Adjustment in the Design
Specification dialog box. Ease of Adjustment is generally not applicable when running analytical Design
of Experiment (DOE)s. This option is primarily used when you’re using Adams/Insight in stand-alone
mode. It lets you specify a relative expense to modify one factor compared to another when running a
physical DOE.
Standard Deviation
The standard deviation of a random variable is defined as the positive square root of the Variance.
Standard deviation is a measure of the variability of the variable about the mean. If a variable is normally-
distributed (in other words, its distribution follows the standard bell-shaped curve), 68.3% of the time its
value will fall within one standard deviation of the mean, 95.4% within two standard deviations, and
99.7% within 3 standard deviations.
Standard Error
For any estimate, the standard error represents the variability in that estimate. In a regression model, the
standard error of a coefficient is an estimate of the Standard Deviation that would be obtained by
repeatedly estimating the coefficient with new data.
Studentized Residuals
Studentized residuals are residual values that are scaled to make them independent of the magnitude of
the actual Residuals. This makes it easier to identify large errors in the estimates. Studentized residuals
always have a Variance and Standard Deviation of 1. If the fitted model is correct, and basic assumptions
about errors are true, then the residuals should be normally-distributed. Therefore, for a good model
almost all studentized residuals should be between -3 and 3, with most between -2 and 2, and about 2/3
between -1 and 1.
If most Studentized residuals fall within these guidelines, but one or two runs stand out as poor, those
runs may be Outliers that should be corrected or removed. If many Studentized residuals fall outside
these guidelines, the model may not be accurate and may need more terms or a smaller range of factor
values.
Note: If the fit is exact, there is no residual and therefore no standard error for the residual, so the
studentized residual is undefined. The Cook's statistic is similar (see Cook's Statistics). In
Adams/Insight, if the absolute value of the raw residual is < 1e-12, the fit is considered exact
and both the Cook's statistic and the studentized residual are set to zero.
Troubleshooting: If the residuals are ~1e-10, then the regression is more or less an exact fit, and many
of the measures become undefined or lose their meaning.
References:
• DS - Draper and Smith, Applied Regression Analysis, p. 207
• MM - Myers and Montgomery, Response Surface Methodology: Process and Product Optimization
Using Designed Experiments, p. 45
The studentized residual is computed as:
r_i = e_i / (s^2 (1 - h_ii))^(1/2)
where:
• r_i = ith studentized residual
• e_i = ith residual
• s^2 = estimate of error variance; that is, the residual mean square (MSE) from the ANOVA table
• h_ii = ith entry on the hat matrix diagonal, where H = X(X'X)^(-1)X'
The denominator is the standard error of the ith residual, so the studentized residual is the raw residual
normalized by dividing by its Standard Error. This is also called the internally studentized residual. There
is a variation called the externally studentized residual.
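For a simple one-factor regression, the hat-matrix diagonal has a closed form, h_ii = 1/n + (x_i - x_bar)^2 / Sxx, which makes the internally studentized residuals easy to compute by hand. This is a generic Python sketch with made-up data, not an Adams/Insight API:

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.9, 5.4]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
sxx = sum((x - x_bar) ** 2 for x in xs)

# Least-squares fit y = b0 + b1*x
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sxx
b0 = y_bar - b1 * x_bar

residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
# MSE with n - 2 degrees of freedom (two fitted coefficients)
s2 = sum(e ** 2 for e in residuals) / (n - 2)
# Closed-form hat-matrix diagonal for simple linear regression
leverages = [1 / n + (x - x_bar) ** 2 / sxx for x in xs]
# r_i = e_i / sqrt(s^2 * (1 - h_ii))
studentized = [e / math.sqrt(s2 * (1 - h)) for e, h in zip(residuals, leverages)]
```

For this well-behaved data, every studentized residual falls inside the (-3, 3) guideline.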
T-value
The T-value is a statistic used to test a hypothesis by comparing it to a T-distribution. In regression, it
is used to determine whether or not a term in the model is significant (see Significance). Assuming that
the underlying distribution is normal, a T-value is calculated under the hypothesis that the true value of
the coefficient for that term is zero. If the T-value is large, this hypothesis is rejected, indicating that the
true value is not zero and that the term is indeed significant.
Values: T-values can be either positive or negative.
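As an illustration, the T-value of a slope in a simple least-squares fit is the coefficient divided by its standard error, sqrt(MSE / Sxx). This is a generic Python sketch with made-up data, not an Adams/Insight API:

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.9, 5.4]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
sxx = sum((x - x_bar) ** 2 for x in xs)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sxx  # slope coefficient
b0 = y_bar - b1 * x_bar

mse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
se_b1 = math.sqrt(mse / sxx)  # standard error of the slope
t_value = b1 / se_b1          # large |t| => reject "coefficient is zero"
```

Here the T-value comes out around 15, so the hypothesis that the slope is zero would be firmly rejected.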
Term
The equation used to fit the data consists of various terms. The terms can be linear, interaction,
quadratic, or cubic.
For example, let F1 and F2 be two factors and R1 the response. The regression equation for R1 can take
the form:
R1 = C0 + C1*F1 + C2*F2 + C3*F1*F2 + C4*F1*F1
Here, the Cs are coefficients:
• C1*F1 and C2*F2 are the linear terms
• C3*F1*F2 is an interactions term
• C4*F1*F1 is a quadratic term
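The example equation above can be written directly as a function. The coefficient values below are made up purely for illustration:

```python
def r1(f1, f2):
    # Hypothetical coefficients C0..C4 (illustrative values only)
    c0, c1, c2, c3, c4 = 1.0, 0.5, -0.25, 2.0, 0.1
    linear = c1 * f1 + c2 * f2   # linear terms C1*F1 and C2*F2
    interaction = c3 * f1 * f2   # interactions term C3*F1*F2
    quadratic = c4 * f1 * f1     # quadratic term C4*F1*F1
    return c0 + linear + interaction + quadratic
```

Fitting a regression means choosing the coefficients; evaluating the response at a factor setting is then just a function call, for example r1(1.0, 1.0).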
Term Significance
Term significance is defined as the probability that the term coefficient is zero; in other words, the
probability that the term does not affect the response. The term significance is computed from the term
T-value.
For instance, let the T-value in a T-test for a term in a model be 11.0. Significance then gives us the
probability of the T-value being as high as 11.0 under the assumption that the true value of the coefficient
for that term is zero. To identify terms that have a significant effect on the regression, look for low
significance values.
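Conceptually, the significance is the two-sided tail probability of the T-value. Below is a minimal Python sketch using the normal approximation to the T-distribution (adequate for large degrees of freedom; exact values require the t CDF, which is not in the standard library):

```python
import math

def term_significance(t_value):
    # Two-sided p-value under H0: coefficient == 0.
    # Normal approximation to the T-distribution:
    # P(|T| >= |t|) ~= erfc(|t| / sqrt(2))
    return math.erfc(abs(t_value) / math.sqrt(2))
```

A T-value of 11.0 gives a significance far below 0.1, flagging the term as important, while a T-value near zero gives a significance near 1.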
Terms Table
The terms table displays these quantities for each term in the fitted polynomial:
• Coefficient
• Confidence Interval
• Standard Error
• Beta (standardized coefficient)
• T statistic
• Term Significance
• Term definition (Term)
Trial
A trial is a single run from the total number of runs that together make up the experiment. Each row of
the Work Space matrix represents a trial or run. With each run, the inputs (factors) are modified and the
output (response) is monitored and recorded.
Variance
The variance of a random variable is defined as the expected (average) value of the squared difference
from the mean. Variance is a measure of the variability of the variable about the mean. The positive
square root of the variance is the Standard Deviation.
The range-to-variance ratio measures how well a fitted regression model might predict new values. It is
defined as the range of estimated values at the original data points, divided by the average variance of the
estimated values. It measures the variation of predicted values due to the model versus the variation due
to uncertainty in the model. A high value (greater than 10, for example) indicates that the prediction is
likely worthwhile. A low value (less than 4, for example) suggests that the uncertainty in the model is
high enough that predicted changes may not be significant.
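The definition of variance can be verified in a few lines of Python (generic stdlib code; the data values are arbitrary). Note that this uses the population form, dividing by n, matching the "average squared difference from the mean" definition:

```python
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = statistics.mean(data)
# Variance: average of the squared differences from the mean
var = sum((x - mean) ** 2 for x in data) / len(data)
# Standard deviation: positive square root of the variance
sd = math.sqrt(var)
```

The hand-rolled `var` agrees with `statistics.pvariance(data)`, the standard library's population variance.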
Work Space Column Calculator Example