



use MCMC posterior as prior for future inference


Would you kindly let me know how to use the estimated posterior distribution as the prior for another Bayesian update? Could this even be done in an iterative manner? E.g., in my case the posterior is updated according to a spatially correlated prior update. Would this be much less efficient than doing all of the updating at once?










  • Some keywords for a deeper search would be particle filter and SMC for sequential Monte Carlo. – Xi'an, Aug 15 at 1:40










  • Related question: stats.stackexchange.com/q/202342/173437 – John Zito, Aug 15 at 4:33

















mcmc markov-random-field






asked Aug 15 at 0:55









colddie

1 Answer














Strictly speaking, you have to rerun your MCMC algorithm from scratch to approximate the new posterior. MCMC algorithms are not sequential, which means that you cannot update their output with new data to update your estimate of the posterior. You just have to redo it.



However, you can use importance sampling to recursively update your posterior approximation with new data. Here are two approaches:



Quick and Dirty (and not quite right)



You already have the output $\{\theta^{(i)}\}_{i=1}^{M}$ from an MCMC algorithm that targets $p(\theta\,|\,y_{1:t-1})$. You then observe $y_t$, and you want to somehow recycle $\{\theta^{(i)}\}_{i=1}^{M}$ to approximate $p(\theta\,|\,y_{1:t})$ without having to re-do everything. As I said, in order to be doing things 100% correctly, you should rerun the MCMC from scratch. But if you were hellbent on not doing that, you could do the following. Pretend that $\{\theta^{(i)}\}_{i=1}^{M}$ are iid draws from $p(\theta\,|\,y_{1:t-1})$. Then treat them as proposal draws for an importance sampling approximation to $p(\theta\,|\,y_{1:t})$. The importance weights will be



$$w_i \propto \frac{p(\theta^{(i)}\,|\,y_{1:t})}{p(\theta^{(i)}\,|\,y_{1:t-1})} \propto p(y_t\,|\,y_{1:t-1},\,\theta^{(i)}).$$



The leap of faith here is treating the MCMC draws like they were evenly-weighted, iid draws from the source density $p(\theta\,|\,y_{1:t-1})$. But for private, exploratory purposes, it's not an insane thing to do when you already have the MCMC draws lying around and you want to update the approximation based on one or two new observations.
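To make the recipe concrete, here is a minimal sketch of the reweighting step. The Gaussian model (a $N(\theta, 1)$ likelihood) and the numbers are illustrative assumptions, not from the post; in practice `theta` would be the stored output of your MCMC run rather than direct draws.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for stored MCMC output approximating p(theta | y_{1:t-1}).
# (Assumption: a toy setup with direct draws, so the example is self-contained.)
M = 5000
theta = rng.normal(loc=1.0, scale=0.3, size=M)

# A new observation y_t arrives; the model assumes y_t ~ N(theta, 1).
y_t = 1.4

# Importance weights: w_i proportional to p(y_t | theta^(i)).
# Work on the log scale for numerical stability, then normalize.
log_w = -0.5 * (y_t - theta) ** 2      # log N(y_t | theta, 1) up to a constant
log_w -= log_w.max()
w = np.exp(log_w)
w /= w.sum()

# The weighted draws now approximate p(theta | y_{1:t}),
# e.g. an estimate of the updated posterior mean:
post_mean = np.sum(w * theta)

# Optionally resample to recover an evenly-weighted sample.
theta_new = theta[rng.choice(M, size=M, p=w)]
```

The posterior mean estimate lands between the old posterior mean and the new observation, as the reweighting argument predicts.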



Best Practices



If you know in advance that you'll want to be recursively updating your posterior approximation when you observe new data, the best thing to do from the outset is to use sequential Monte Carlo (SMC) to approximate the posterior. Here are some papers:




  • Chopin (2002, Biometrika);


  • Chopin (2004, Annals of Statistics).

Like the other approach, SMC is an importance sampling based method that allows you to iteratively update your posterior approximation as new data arrive. You start with a sample of iid draws from the prior, and then you recursively re-weight the sample to reflect the new information. Along the way, you also use MCMC to move each draw in your sample to a location in the parameter space that better reflects the influence of new data.
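The recursion described above can be sketched in a few lines: reweight by the new likelihood, resample when the effective sample size degenerates, then rejuvenate each particle with an MCMC move targeting the current posterior. The Gaussian model, the synthetic data, and the Metropolis step size below are illustrative assumptions, not part of the cited papers' setups.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (assumed for illustration): theta ~ N(0, 1) prior, y_t ~ N(theta, 1).
def log_lik(y, theta):
    return -0.5 * (y - theta) ** 2

def log_post(theta, ys):
    # log prior + log likelihood of all data seen so far (up to a constant)
    return -0.5 * theta ** 2 + sum(log_lik(y, theta) for y in ys)

M = 2000
particles = rng.normal(size=M)          # iid draws from the prior
weights = np.full(M, 1.0 / M)
data = [0.8, 1.2, 0.9, 1.1]             # synthetic observations (assumed)
seen = []

for y in data:
    seen.append(y)
    # 1. Reweight by the likelihood of the new observation.
    log_w = np.log(weights) + log_lik(y, particles)
    log_w -= log_w.max()
    weights = np.exp(log_w)
    weights /= weights.sum()
    # 2. Resample if the effective sample size has degenerated.
    ess = 1.0 / np.sum(weights ** 2)
    if ess < M / 2:
        particles = particles[rng.choice(M, size=M, p=weights)]
        weights = np.full(M, 1.0 / M)
        # 3. Move each particle with one Metropolis step targeting p(theta | y_1:t).
        prop = particles + 0.5 * rng.normal(size=M)
        accept = np.log(rng.uniform(size=M)) < log_post(prop, seen) - log_post(particles, seen)
        particles = np.where(accept, prop, particles)

post_mean = np.sum(weights * particles)
```

For real problems, a mature implementation (e.g. an existing SMC library) is preferable to hand-rolled loops like this, but the three-step structure is the same.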























        edited Aug 15 at 4:11

























        answered Aug 15 at 2:31









John Zito
