


Validation and verification of mathematical models




Within the subject of simulation I have found some literature on validation and verification (e.g. Sargent's paper). My question is, what techniques do you use to validate and verify your mathematical (optimization) models and their implementation?





























      optimization modeling simulation














      edited Aug 7 at 18:35









      Kevin Dalmeijer











      asked Aug 7 at 11:02









Djames
























5 Answers



















Generally speaking, verification refers to evaluating the conformance of a system, product, or service with its intended requirements and design specifications. Validation, on the other hand, refers to evaluating how well that system, product, or service meets the customers' or stakeholders' expectations.



          AFAIK, there is no concrete literature on verification and validation of optimization models. Over the years, I gathered some useful points and methods based on my own and other colleagues' experiences as well as what is mentioned in the literature.



          Verification in Optimization:



          Here, verification refers to the extent to which the model/solution method is performing in accordance with the initial modeling assumptions. Some useful methods are as follows:



          • Double-check the mathematics/coding of your model/algorithm with the problem definition and assumptions to make sure everything is OK (i.e., no assumption is violated or left out of the model/algorithm).

          • Check whether the solution provided by a solver or your implemented algorithm is sound and logical (e.g., it doesn't violate any constraint).

          • Feed deliberately designed feasible and infeasible solutions (constructed by hand, or taken from experts or the real system) to your model and solution method, and see whether it confirms the feasibility/infeasibility of each.

          • Fix some decision variables and/or change input data to see if the model/algorithm behavior changes. It's possible that the model is OK with some input data, but not with some others (this could either be due to the input data or the model itself).

          • Compare the solutions provided by the model (obtained via a solver) and your implemented solution method to see if there is any concerning disparity.
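As a hedged illustration of the second bullet (checking a returned solution independently of the solver), here is a minimal pure-Python sketch; the problem, function name, and numbers are invented toy data, not part of the original answer:

```python
# Verification sketch: independently re-check a "solver" solution against the
# original problem data. All data here is a made-up toy knapsack instance.

def check_knapsack_solution(values, weights, capacity, chosen, reported_obj):
    """Return a list of human-readable verification failures (empty = OK)."""
    failures = []
    total_weight = sum(weights[i] for i in chosen)
    if total_weight > capacity:
        failures.append(f"capacity violated: {total_weight} > {capacity}")
    if len(set(chosen)) != len(chosen):
        failures.append("an item is selected more than once")
    recomputed = sum(values[i] for i in chosen)
    if recomputed != reported_obj:
        failures.append(f"objective mismatch: recomputed {recomputed}, reported {reported_obj}")
    return failures

values, weights, capacity = [10, 7, 4], [5, 4, 3], 8
# Pretend a solver returned items {0, 2} with objective 14: should pass.
print(check_knapsack_solution(values, weights, capacity, [0, 2], 14))
# A deliberately infeasible solution: the checker should flag it.
print(check_knapsack_solution(values, weights, capacity, [0, 1], 17))
```

The point is that the checker shares no code with the solver, so a modeling or implementation bug is unlikely to hide in both.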

          Validation in Optimization:



          Here, validation refers to the extent to which the model/algorithm is satisfying the expectations of the problem owner. Some useful methods are as follows:



          • Compare your solution with the current solution in the system and see whether it can outperform the incumbent.

          • Use standard problem instances and compare your results with the best-known solutions from the literature.


          • If there are no standard problem instances, create random ones (either according to the literature with some tweaks or completely from scratch), and then do one of the following:



            • Compare your algorithm with solutions obtained from solving the mathematical model using a solver.


            • Compare your model/algorithm with solutions obtained from the state-of-the-art models/algorithms.


            • Develop a lower/upper bound on the objective function (e.g., using linear programming relaxation or Lagrangian relaxation). This could also be useful when you have real-world data from the system but no previously implemented solution exists.
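The last bullet (bounding the objective) can be sketched for a toy 0/1 knapsack, where the LP relaxation happens to be solvable exactly by the greedy fractional rule; the heuristic value and all numbers below are invented for illustration:

```python
# Validation sketch: bound the optimality gap of a heuristic solution with an
# LP-relaxation bound. For 0/1 knapsack, the LP relaxation is solved exactly
# by Dantzig's greedy fractional rule. Toy data only.

def fractional_knapsack_bound(values, weights, capacity):
    """Upper bound on the 0/1 knapsack optimum (LP-relaxation value)."""
    items = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)
    bound, remaining = 0.0, capacity
    for i in items:
        take = min(weights[i], remaining)      # take the item fractionally if needed
        bound += values[i] * take / weights[i]
        remaining -= take
        if remaining == 0:
            break
    return bound

values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
heuristic_value = 220                          # pretend this came from your heuristic
bound = fractional_knapsack_bound(values, weights, capacity)
gap = (bound - heuristic_value) / bound        # worst-case optimality gap
print(f"bound={bound}, gap={gap:.1%}")
```

A small gap validates the heuristic even when no incumbent or benchmark solution exists.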







          answered Aug 7 at 12:42 by Ehsan (edited Aug 7 at 17:39)





















          • I haven't found literature that considers strategies for optimisation models in general, but this specific example of V&V usage in optimisation is worth a look.
            – TheSimpliFire, Aug 7 at 12:50











          • This is already a good answer; there are only two points I miss: visualization (also mentioned by @EhsanK below) and simulation (in the sense that the solution to an optimization problem can sometimes be used to design a simulation that is then fed with real or random data, to see how the system behaves under stochastic influences).
            – Philipp Christophel, Aug 8 at 11:21










          • @philipp-christophel: Good points. Verification and validation of models under uncertainty might require a separate post containing a mixture of simulation, EVPI and VSS metrics, etc. I'll try to update my answer with some new points. Also, visualization is a must. I use it heavily when dealing with routing and network design problems.
            – Ehsan, Aug 8 at 16:39




















          This is just to complement the answer that @Ehsan gave.



          For both verification and validation, if possible, visualize your results. For example, if you are solving a routing problem, simply visualizing the resulting routes may provide great insights.



          • Does the solution (routes) make sense?

          • Are constraints being violated that I had no idea about (say you had no idea about subtour elimination; visualizing the routes may help you identify the subtours)?

          • Should I add more constraints that I was not considering, simply by looking at the results? Imagine you see a route that starts in FL, goes to NY and then CA. Maybe it doesn't violate any of your current constraints, but it's something you don't like to have.
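The subtour example above can also be checked programmatically, as a complement to eyeballing a plot. A minimal sketch (the successor-mapping representation and the toy solutions are assumptions for illustration):

```python
# Visual inspection of routes often reveals subtours; the same check can be
# automated. Given successor assignments successor[i] = next stop after i
# (a toy "solution"), count the disjoint cycles: a valid single tour visiting
# all nodes has exactly one.

def count_subtours(successor):
    """Number of disjoint cycles in a successor mapping over all nodes."""
    unvisited, cycles = set(successor), 0
    while unvisited:
        node = next(iter(unvisited))
        while node in unvisited:          # walk one cycle until it closes
            unvisited.remove(node)
            node = successor[node]
        cycles += 1
    return cycles

single_tour = {0: 1, 1: 2, 2: 3, 3: 0}
broken      = {0: 1, 1: 0, 2: 3, 3: 2}   # two subtours: elimination constraints missing
print(count_subtours(single_tour))       # a valid tour
print(count_subtours(broken))            # more than one cycle -> model is missing constraints
```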

          Also, check this answer which shares a procedure for debugging an optimization code which can be very useful in your verification process.





























            For models being deployed in the "real world" (something that, as an academic, I have heard rumors about), one element of validation is to describe both the objective criteria and the constraints to the decision makers (in their language, not in algebraic terms) and see if they agree with you. Another is to run the model on historical inputs and see if the resulting solution makes sense to them. This is an opportunity for them to discover constraints they kinda sorta forgot to mention the first time around, or to modify the objective criterion.



            Switching to verification, if the model parameterized by historical inputs is infeasible, then you have likely found a flaw in the model. If the model using historical data is feasible but either (a) the historical solution is not feasible or (b) the historical solution is better than the "optimal" solution, the model is likely wrong.
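The historical-input checks in the last paragraph can be sketched in a few lines; the production-planning model, names, and all numbers below are invented toy data, not from the answer:

```python
# Verification sketch: plug a historical solution into the model's constraints
# and objective. If history comes out "infeasible", or beats the "optimal"
# value, the model (not history) is the likely culprit. Toy data only.

def is_feasible(plan, demand, machine_hours, hours_per_unit):
    """Check a production plan against demand and capacity constraints."""
    meets_demand = all(plan[p] >= demand[p] for p in plan)
    within_capacity = sum(plan[p] * hours_per_unit[p] for p in plan) <= machine_hours
    return meets_demand and within_capacity

demand         = {"A": 10, "B": 5}
hours_per_unit = {"A": 2.0, "B": 3.0}
machine_hours  = 40
historical_plan = {"A": 11, "B": 6}              # what was actually produced
model_optimal_cost = 80.0                        # pretend output of the solver
historical_cost = 5.0 * historical_plan["A"] + 4.0 * historical_plan["B"]

print(is_feasible(historical_plan, demand, machine_hours, hours_per_unit))
if historical_cost < model_optimal_cost:
    print("red flag: history beats the 'optimal' solution -> inspect the model")
```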





























              Kleijnen (1995) analyses various methods of model verification and validation (V&V). Quoting the abstract (with slight changes to formatting):




              For verification it discusses



              1. general good programming practice (such as modular programming),


              2. checking intermediate simulation outputs through tracing and statistical testing per module,


              3. statistical testing of final simulation outputs against analytical results, and


              4. animation.


              For validation it discusses



              1. obtaining real-world data,


              2. comparing simulated and real data through simple tests such as graphical, Schruben-Turing, and t-tests,


              3. testing whether simulated and real responses are positively correlated and moreover have the same mean, using two new statistical procedures based on regression analysis,


              4. sensitivity analysis based on design of experiments and regression analysis, and risk or uncertainty analysis based on Monte Carlo sampling, and


              5. white versus black box simulation models.


              Both verification and validation require good documentation, and are crucial parts of assessment, credibility, and accreditation.
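Point 3 of the verification list (statistical testing of outputs against analytical results) can be sketched with stdlib Python only; the exponential-sampling "simulation" and the helper name are assumptions for illustration:

```python
# Verification sketch: test a simulation output against a known analytical
# result. Here the "simulation" just samples exponential service times with
# rate mu, whose analytical mean is 1/mu; a one-sample t statistic flags
# implementation errors that push the output away from theory.

import math
import random
import statistics

def t_statistic(sample, analytic_mean):
    """One-sample t statistic of the sample mean against an analytic value."""
    n = len(sample)
    return (statistics.fmean(sample) - analytic_mean) / (statistics.stdev(sample) / math.sqrt(n))

random.seed(42)
mu = 2.0
sample = [random.expovariate(mu) for _ in range(10_000)]
t = t_statistic(sample, 1.0 / mu)
print(f"t = {t:.2f}")  # |t| below ~1.96 is consistent with the analytic mean at the 5% level
```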





              Reference



              [1] Kleijnen, J. P. C. (1995). Verification and validation of simulation models. European Journal of Operational Research. 82(1):145-162. https://doi.org/10.1016/0377-2217(94)00016-6




















              • The OP's question is about optimization. While the mentioned methods might apply to optimization models, they are more suitable for simulation models.
                – Ehsan, Aug 7 at 12:44











              • @Ehsan Agreed. I answered since the OP wrote mathematical models but with optimisation in brackets. Therefore I would expect your answer to be more useful to them.
                – TheSimpliFire, Aug 7 at 12:47



















              I really like the paper by Coffin and Saltzman (INFORMS JOC, 2000), which argues that we should be using much more rigorous statistical tests when we compare the performance of one algorithm/heuristic against another.



              This is not exactly about validation (which I interpret as "checking correctness") but rather about comparison, so feel free to tell me if this answer is off-topic.
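A minimal sketch of the kind of rigor the paper argues for: a paired t statistic over per-instance results of two heuristics, rather than a bare comparison of means. The instance values below are invented toy numbers, and a real study would also check the test's assumptions:

```python
# Compare two heuristics across the *same* instances with a paired test.
# Stdlib-only paired t statistic; the per-instance results are toy data.

import math
import statistics

def paired_t(a, b):
    """Paired t statistic for per-instance results a[i] vs b[i]."""
    diffs = [x - y for x, y in zip(a, b)]
    return statistics.fmean(diffs) / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

heuristic_a = [101, 98, 105, 97, 102, 99, 103, 100]   # objective per instance
heuristic_b = [103, 99, 108, 99, 104, 100, 106, 101]
t = paired_t(heuristic_a, heuristic_b)
print(f"paired t = {t:.2f}")   # large |t| -> the difference is unlikely to be noise
```

Pairing matters because instance difficulty varies far more than the gap between good heuristics; an unpaired test would drown the signal in that variance.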























                Your Answer








                StackExchange.ready(function()
                var channelOptions =
                tags: "".split(" "),
                id: "700"
                ;
                initTagRenderer("".split(" "), "".split(" "), channelOptions);

                StackExchange.using("externalEditor", function()
                // Have to fire editor after snippets, if snippets enabled
                if (StackExchange.settings.snippets.snippetsEnabled)
                StackExchange.using("snippets", function()
                createEditor();
                );

                else
                createEditor();

                );

                function createEditor()
                StackExchange.prepareEditor(
                heartbeatType: 'answer',
                autoActivateHeartbeat: false,
                convertImagesToLinks: false,
                noModals: true,
                showLowRepImageUploadWarning: true,
                reputationToPostImages: null,
                bindNavPrevention: true,
                postfix: "",
                imageUploader:
                brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
                contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
                allowUrls: true
                ,
                noCode: true, onDemand: true,
                discardSelector: ".discard-answer"
                ,immediatelyShowMarkdownHelp:true
                );



                );













                draft saved

                draft discarded


















                StackExchange.ready(
                function ()
                StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2for.stackexchange.com%2fquestions%2f1193%2fvalidation-and-verification-of-mathematical-models%23new-answer', 'question_page');

                );

                Post as a guest















                Required, but never shown

























                5 Answers
                5






                active

                oldest

                votes








                5 Answers
                5






                active

                oldest

                votes









                active

                oldest

                votes






                active

                oldest

                votes









                19












                $begingroup$

                Generally speaking, verification refers to evaluating the conformation of a system, product, or service with its intended requirements and design specifications. On the other hand, validation refers to evaluate how much that system, product, or service is in accordance with the customers' or stakeholders' expectations.



                AFAIK, there is no concrete literature on verification and validation of optimization models. Over the years, I gathered some useful points and methods based on my own and other colleagues' experiences as well as what is mentioned in the literature.



                Verification in Optimization:



                Here, verification refers to the extent to which the model/solution method is performing in accordance with the initial modeling assumptions. Some useful methods are as follows:



                • Double-check the mathematics/coding of your model/algorithm with the problem definition and assumptions to make sure everything is OK (i.e., no assumption is violated or left out of the model/algorithm).

                • Check whether the solution provided by a solver or your implemented algorithm is sound and logical (e.g., it doesn't violate any constraint).

                • Feed deliberately designed (either by hand or the solutions obtained from the experts or the real system) feasible and infeasible solutions to your model and solution method and see if it can confirm the feasibility/infeasibility of the provided solution.

                • Fix some decision variables and/or change input data to see if the model/algorithm behavior changes. It's possible that the model is OK with some input data, but not with some others (this could either be due to the input data or the model itself).

                • Compare the solutions provided by the model (obtained via a solver) and your implemented solution method to see if there is any concerning disparity.

                Validation in Optimization:



                Here, validation refers to the extent to which the model/algorithm is satisfying the expectations of the problem owner. Some useful methods are as follows:



                • Compare your solution with the current solution in the system and see whether it can outperform the incumbent.

                • Use standard problem instances and compare your results with the best-known solutions form the literature.


                • If there are no standard problem instances, create random ones (either according to the literature with some tweaks or completely from scratch), and then do one of the following:



                  • Compare your algorithm with solutions obtained from solving the mathematical model using a solver.


                  • Compare your model/algorithm with solutions obtained from the state-of-the-art models/algorithms.


                  • Develop a lower/upper bound on the objective function (e.g., using linear programming relaxation or Lagrangian relaxation). This could also be useful when you have real-world data from the system but no previously implemented solution exists.







                share|improve this answer











                $endgroup$










                • 1




                  $begingroup$
                  I haven't found literature that consider strategies for optimisation models in general, but this specific example of V&V usage in optimisation is worth taking a look.
                  $endgroup$
                  – TheSimpliFire
                  Aug 7 at 12:50











                • $begingroup$
                  This is already a good answer, there are only to points I miss: Visualization (also mentioned by @EhsanK below) and Simulation (in the sense that the solution to an optimization problem sometimes can be used to design a simulation that then is feed with real or random data to see how the system behaves under stochastic influences).
                  $endgroup$
                  – Philipp Christophel
                  Aug 8 at 11:21










                • $begingroup$
                  @philipp-christophel: Good points. Verification and validation of models under uncertainty might require a separate post containing a mixture of simulation, EVPI and VSS metrics, etc. I'll try to update my answer with some new points. Also, visualization is a must. I use it heavily when dealing with routing and network design problems.
                  $endgroup$
                  – Ehsan
                  Aug 8 at 16:39
















                19












                $begingroup$

                Generally speaking, verification refers to evaluating the conformation of a system, product, or service with its intended requirements and design specifications. On the other hand, validation refers to evaluate how much that system, product, or service is in accordance with the customers' or stakeholders' expectations.



                AFAIK, there is no concrete literature on verification and validation of optimization models. Over the years, I gathered some useful points and methods based on my own and other colleagues' experiences as well as what is mentioned in the literature.



                Verification in Optimization:



                Here, verification refers to the extent to which the model/solution method is performing in accordance with the initial modeling assumptions. Some useful methods are as follows:



                • Double-check the mathematics/coding of your model/algorithm with the problem definition and assumptions to make sure everything is OK (i.e., no assumption is violated or left out of the model/algorithm).

                • Check whether the solution provided by a solver or your implemented algorithm is sound and logical (e.g., it doesn't violate any constraint).

                • Feed deliberately designed (either by hand or the solutions obtained from the experts or the real system) feasible and infeasible solutions to your model and solution method and see if it can confirm the feasibility/infeasibility of the provided solution.

                • Fix some decision variables and/or change input data to see if the model/algorithm behavior changes. It's possible that the model is OK with some input data, but not with some others (this could either be due to the input data or the model itself).

                • Compare the solutions provided by the model (obtained via a solver) and your implemented solution method to see if there is any concerning disparity.

                Validation in Optimization:



                Here, validation refers to the extent to which the model/algorithm is satisfying the expectations of the problem owner. Some useful methods are as follows:



                • Compare your solution with the current solution in the system and see whether it can outperform the incumbent.

                • Use standard problem instances and compare your results with the best-known solutions form the literature.


                • If there are no standard problem instances, create random ones (either according to the literature with some tweaks or completely from scratch), and then do one of the following:



                  • Compare your algorithm with solutions obtained from solving the mathematical model using a solver.


                  • Compare your model/algorithm with solutions obtained from the state-of-the-art models/algorithms.


                  • Develop a lower/upper bound on the objective function (e.g., using linear programming relaxation or Lagrangian relaxation). This could also be useful when you have real-world data from the system but no previously implemented solution exists.







                share|improve this answer











                $endgroup$










                • 1




                  $begingroup$
                  I haven't found literature that consider strategies for optimisation models in general, but this specific example of V&V usage in optimisation is worth taking a look.
                  $endgroup$
                  – TheSimpliFire
                  Aug 7 at 12:50











                • $begingroup$
                  This is already a good answer, there are only to points I miss: Visualization (also mentioned by @EhsanK below) and Simulation (in the sense that the solution to an optimization problem sometimes can be used to design a simulation that then is feed with real or random data to see how the system behaves under stochastic influences).
                  $endgroup$
                  – Philipp Christophel
                  Aug 8 at 11:21










                • $begingroup$
                  @philipp-christophel: Good points. Verification and validation of models under uncertainty might require a separate post containing a mixture of simulation, EVPI and VSS metrics, etc. I'll try to update my answer with some new points. Also, visualization is a must. I use it heavily when dealing with routing and network design problems.
                  $endgroup$
                  – Ehsan
                  Aug 8 at 16:39














                19












                19








                19





                $begingroup$

                Generally speaking, verification refers to evaluating the conformation of a system, product, or service with its intended requirements and design specifications. On the other hand, validation refers to evaluate how much that system, product, or service is in accordance with the customers' or stakeholders' expectations.



                AFAIK, there is no concrete literature on verification and validation of optimization models. Over the years, I gathered some useful points and methods based on my own and other colleagues' experiences as well as what is mentioned in the literature.



                Verification in Optimization:



                Here, verification refers to the extent to which the model/solution method is performing in accordance with the initial modeling assumptions. Some useful methods are as follows:



                • Double-check the mathematics/coding of your model/algorithm with the problem definition and assumptions to make sure everything is OK (i.e., no assumption is violated or left out of the model/algorithm).

                • Check whether the solution provided by a solver or your implemented algorithm is sound and logical (e.g., it doesn't violate any constraint).

                • Feed deliberately designed (either by hand or the solutions obtained from the experts or the real system) feasible and infeasible solutions to your model and solution method and see if it can confirm the feasibility/infeasibility of the provided solution.

                • Fix some decision variables and/or change input data to see if the model/algorithm behavior changes. It's possible that the model is OK with some input data, but not with some others (this could either be due to the input data or the model itself).

                • Compare the solutions provided by the model (obtained via a solver) and your implemented solution method to see if there is any concerning disparity.

                Validation in Optimization:



                Here, validation refers to the extent to which the model/algorithm is satisfying the expectations of the problem owner. Some useful methods are as follows:



                • Compare your solution with the current solution in the system and see whether it can outperform the incumbent.

                • Use standard problem instances and compare your results with the best-known solutions form the literature.


                • If there are no standard problem instances, create random ones (either according to the literature with some tweaks or completely from scratch), and then do one of the following:



                  • Compare your algorithm with solutions obtained from solving the mathematical model using a solver.


                  • Compare your model/algorithm with solutions obtained from the state-of-the-art models/algorithms.


                  • Develop a lower/upper bound on the objective function (e.g., using linear programming relaxation or Lagrangian relaxation). This could also be useful when you have real-world data from the system but no previously implemented solution exists.







                share|improve this answer











                $endgroup$



                Generally speaking, verification refers to evaluating the conformation of a system, product, or service with its intended requirements and design specifications. On the other hand, validation refers to evaluate how much that system, product, or service is in accordance with the customers' or stakeholders' expectations.



                AFAIK, there is no concrete literature on verification and validation of optimization models. Over the years, I gathered some useful points and methods based on my own and other colleagues' experiences as well as what is mentioned in the literature.



                Verification in Optimization:



                Here, verification refers to the extent to which the model/solution method is performing in accordance with the initial modeling assumptions. Some useful methods are as follows:



                • Double-check the mathematics/coding of your model/algorithm with the problem definition and assumptions to make sure everything is OK (i.e., no assumption is violated or left out of the model/algorithm).

                • Check whether the solution provided by a solver or your implemented algorithm is sound and logical (e.g., it doesn't violate any constraint).

                • Feed deliberately designed (either by hand or the solutions obtained from the experts or the real system) feasible and infeasible solutions to your model and solution method and see if it can confirm the feasibility/infeasibility of the provided solution.

                • Fix some decision variables and/or change input data to see if the model/algorithm behavior changes. It's possible that the model is OK with some input data, but not with some others (this could either be due to the input data or the model itself).

                • Compare the solutions provided by the model (obtained via a solver) and your implemented solution method to see if there is any concerning disparity.

                Validation in Optimization:



                Here, validation refers to the extent to which the model/algorithm is satisfying the expectations of the problem owner. Some useful methods are as follows:



                • Compare your solution with the current solution in the system and see whether it can outperform the incumbent.

                • Use standard problem instances and compare your results with the best-known solutions form the literature.


                • If there are no standard problem instances, create random ones (either according to the literature with some tweaks or completely from scratch), and then do one of the following:



                  • Compare your algorithm with solutions obtained from solving the mathematical model using a solver.


                  • Compare your model/algorithm with solutions obtained from the state-of-the-art models/algorithms.


                  • Develop a lower/upper bound on the objective function (e.g., using linear programming relaxation or Lagrangian relaxation). This could also be useful when you have real-world data from the system but no previously implemented solution exists.
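                  The bound idea in the last bullet can be sketched on a toy 0/1 knapsack (invented data): its LP relaxation is solved greedily by value/weight ratio, and any heuristic objective must not exceed the resulting upper bound.

```python
# Greedy solution of the LP relaxation of a 0/1 knapsack: sort by
# value/weight ratio and allow one fractional item. The result is an
# upper bound on the integer optimum, usable to sanity-check a heuristic.

weights = [3, 5, 2, 7]    # made-up data
values = [10, 20, 5, 15]
capacity = 10

def lp_relaxation_bound(weights, values, capacity):
    items = sorted(zip(weights, values), key=lambda wv: wv[1] / wv[0], reverse=True)
    remaining, bound = capacity, 0.0
    for w, v in items:
        take = min(w, remaining)   # take item fractionally if needed
        bound += v * (take / w)
        remaining -= take
        if remaining == 0:
            break
    return bound

heuristic_objective = 30  # say our heuristic packed items 0 and 1 (weight 8)
bound = lp_relaxation_bound(weights, values, capacity)
assert heuristic_objective <= bound  # otherwise the heuristic or model is buggy
print(bound)  # 35.0
```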
















                edited Aug 7 at 17:39

























                answered Aug 7 at 12:42









                Ehsan










                • I haven't found literature that considers strategies for optimisation models in general, but this specific example of V&V usage in optimisation is worth a look.
                  – TheSimpliFire
                  Aug 7 at 12:50











                • This is already a good answer; there are only two points I miss: visualization (also mentioned by @EhsanK below) and simulation (in the sense that the solution to an optimization problem can sometimes be used to design a simulation that is then fed with real or random data to see how the system behaves under stochastic influences).
                  – Philipp Christophel
                  Aug 8 at 11:21










                • @philipp-christophel: Good points. Verification and validation of models under uncertainty might require a separate post covering a mixture of simulation, EVPI and VSS metrics, etc. I'll try to update my answer with some new points. Also, visualization is a must; I use it heavily when dealing with routing and network design problems.
                  – Ehsan
                  Aug 8 at 16:39













                This is just to complement the answer that @Ehsan gave.



                For both verification and validation, if possible, visualize your results. For example, if you are solving a routing problem, simply visualizing the resulting routes may provide great insights.



                • Does the solution (routes) make sense?

                • Are there constraints in the model being violated that I had no idea about? (Suppose you didn't know about subtour elimination; visualizing the routes may help you spot the subtours.)

                • Should I add constraints that I was not considering, simply by looking at the results? Imagine you see a route that starts in FL, goes to NY, and then CA. Maybe it doesn't violate any of your current constraints, but it's something you don't want to have.

                Also, check this answer, which shares a procedure for debugging optimization code that can be very useful in your verification process.
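                For instance (with made-up successor arcs), the subtour issue mentioned above can also be caught programmatically by walking the tour from the depot; plotting the routes reveals the same structural bug at a glance:

```python
# Given successor arcs from a routing solution (invented data), walk the
# tour starting at the depot. If we return to the depot before visiting
# every node, the solution contains a subtour.

successor = {0: 1, 1: 0, 2: 3, 3: 2}  # two disjoint loops: a subtour!

def tour_from(depot, successor):
    tour, node = [depot], successor[depot]
    while node != depot:
        tour.append(node)
        node = successor[node]
    return tour

tour = tour_from(0, successor)
print(tour)                         # [0, 1]
print(len(tour) == len(successor))  # False -> nodes 2 and 3 form a subtour
```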

























                    answered Aug 7 at 13:20









                    EhsanK
























                        For models being deployed in the "real world" (something that, as an academic, I have heard rumors about), one element of validation is to describe both the objective criteria and the constraints to the decision makers (in their language, not in algebraic terms) and see if they agree with you. Another is to run the model on historical inputs and see if the resulting solution makes sense to them. This is an opportunity for them to discover constraints they kinda sorta forgot to mention the first time around, or to modify the objective criterion.



                        Switching to verification, if the model parameterized by historical inputs is infeasible, then you have likely found a flaw in the model. If the model using historical data is feasible but either (a) the historical solution is not feasible or (b) the historical solution is better than the "optimal" solution, the model is likely wrong.
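                        A minimal sketch of this verification idea, with an invented toy model: plug the historical solution into the model's constraints and objective, then assert the sanity conditions above.

```python
# Toy stand-ins for a real model (all numbers invented). If history is
# "infeasible" under our model, or beats our "optimal" solution, the
# model is likely wrong.

def feasible(x):            # stand-in for the model's constraints
    return x[0] + x[1] <= 8 and min(x) >= 0

def objective(x):           # stand-in for the model's objective (maximize)
    return 3 * x[0] + 2 * x[1]

historical = (5, 3)         # what the system actually did
optimal = (8, 0)            # what the solver claims is best

assert feasible(historical), "model rejects reality: a constraint is wrong"
assert objective(optimal) >= objective(historical), "'optimal' loses to history"
```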

























                            answered Aug 7 at 20:29









                            prubin
























                                Kleijnen (1995) analyses various methods of model verification and validation (V&V). Quoting the abstract (with slight changes to formatting):




                                For verification it discusses



                                1. general good programming practice (such as modular programming),


                                2. checking intermediate simulation outputs through tracing and statistical testing per module,


                                3. statistical testing of final simulation outputs against analytical results, and


                                4. animation.


                                For validation it discusses



                                1. obtaining real-world data,


                                2. comparing simulated and real data through simple tests such as graphical, Schruben-Turing, and t-tests,


                                3. testing whether simulated and real responses are positively correlated and moreover have the same mean, using two new statistical procedures based on regression analysis,


                                4. sensitivity analysis based on design of experiments and regression analysis, and risk or uncertainty analysis based on Monte Carlo sampling, and


                                5. white versus black box simulation models.


                                Both verification and validation require good documentation, and are crucial parts of assessment, credibility, and accreditation.





                                Reference



                                [1] Kleijnen, J. P. C. (1995). Verification and validation of simulation models. European Journal of Operational Research, 82(1):145–162. https://doi.org/10.1016/0377-2217(94)00016-6
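                                Validation item 2 above can be sketched with invented data and a hand-rolled Welch t statistic (a rough check only; a proper test would compare against tabulated critical values for the relevant degrees of freedom):

```python
# Compare simulated and real outputs with a two-sample (Welch) t
# statistic using only the stdlib. With reasonable sample sizes, |t|
# well above ~2 hints the simulation mean is off. All data invented.
import statistics

real = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]
simulated = [10.0, 10.4, 9.7, 10.2, 10.1, 9.8]

def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

t = welch_t(real, simulated)
print(abs(t) < 2)  # a crude "means look compatible" check -> True here
```

In practice one would reach for scipy.stats.ttest_ind(real, simulated, equal_var=False), which also returns a p-value.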




















                                • The OP's question is about optimization. While the mentioned methods might apply to optimization models, they are more suitable for simulation models.
                                  – Ehsan
                                  Aug 7 at 12:44











                                • @Ehsan Agreed. I answered since the OP wrote "mathematical models" but with optimisation in brackets. Therefore I would expect your answer to be more useful to them.
                                  – TheSimpliFire
                                  Aug 7 at 12:47















                                answered Aug 7 at 12:36









                                TheSimpliFire

























                                I really like the paper by Coffin and Saltzman (INFORMS JOC, 2000), which argues that we should be using much more rigorous statistical tests when we compare the performance of one algorithm/heuristic against another.



                                This is not exactly about validation (which I interpret as "checking correctness") but rather about comparison, so feel free to tell me if this answer is off-topic.
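                                As a hedged illustration with invented numbers: comparing two heuristics on the same instances calls for paired statistics rather than a comparison of raw means, which is the spirit of the rigor Coffin and Saltzman advocate.

```python
# Paired comparison of two heuristics on shared instances (made-up
# objective values, minimization). The paired t statistic tests whether
# the per-instance differences are systematically nonzero.
import statistics

alg_a = [102, 97, 110, 95, 101, 99, 104, 98]
alg_b = [100, 99, 108, 97, 100, 98, 103, 99]

diffs = [a - b for a, b in zip(alg_a, alg_b)]
mean_d = statistics.mean(diffs)
se = statistics.stdev(diffs) / len(diffs) ** 0.5
t = mean_d / se  # |t| >~ 2.36 (two-sided 5%, df = 7) would suggest a real gap
print(round(t, 2))  # 0.42 -> no evidence of a systematic difference here
```

A nonparametric alternative (e.g. the Wilcoxon signed-rank test, scipy.stats.wilcoxon) is safer when the differences are far from normal.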

























                                    answered Aug 7 at 14:06









                                    LarrySnyder610































                                        Thanks for contributing an answer to Operations Research Stack Exchange!

