PyMC3 regression with a changepoint

Date: 2024-08-22

This post works through a question about fitting a regression with a changepoint in PyMC3, together with the answer that resolved it.

Problem description

I have seen examples of how to do changepoint analysis with pymc3, but I seem to be missing something, because the results I get are far from the true values. Here is a toy example.

The script below generates the data and fits the model:

from pymc3 import *
from pymc3.math import switch   # piecewise selection between the two regimes
from numpy.random import uniform, normal

bp_u = 30            # switch point
c_u = [1, -1]        # intercepts before and after the switch point
beta_u = [0, -0.02]  # slopes before and after the switch point

# Simulate 200 noisy observations from the piecewise-linear model
x = uniform(0, 90, 200)

y = (x < bp_u)*(c_u[0] + beta_u[0]*x) + (x >= bp_u)*(c_u[1] + beta_u[1]*x) + normal(0, 0.1, 200)

with Model() as sw_model:

    sigma = HalfCauchy('sigma', beta=10, testval=1.)

    switchpoint = Uniform('switchpoint', lower=x.min(), upper=x.max(), testval=45)

    # Priors for pre- and post-switch intercepts and slopes
    intercept_u1 = Uniform('Intercept_u1', lower=-10, upper=10)
    intercept_u2 = Uniform('Intercept_u2', lower=-10, upper=10)
    x_coeff_u1 = Normal('x_u1', 0, sd=20)
    x_coeff_u2 = Normal('x_u2', 0, sd=20)

    # Pick the intercept and slope for each observation depending on
    # which side of the switchpoint it falls on
    intercept = switch(switchpoint < x, intercept_u1, intercept_u2)
    x_coeff = switch(switchpoint < x, x_coeff_u1, x_coeff_u2)

    likelihood = Normal('y', mu=intercept + x_coeff * x, sd=sigma, observed=y)

    start = find_MAP()

with sw_model:
    step1 = NUTS([intercept_u1, intercept_u2, x_coeff_u1, x_coeff_u2])
    step2 = NUTS([switchpoint])

    trace = sample(2000, step=[step1, step2], start=start, progressbar=True)

The resulting posterior estimates, however, are quite different from the true values used to generate the data. What am I doing wrong?

Recommended answer

In the end, switching to a discrete breakpoint sampled with Metropolis solved the problem. (A likely reason: through switch, the likelihood is piecewise-constant as a function of the switchpoint, so the gradient that NUTS relies on carries no information about it, whereas a gradient-free Metropolis step over a DiscreteUniform can move it.) Here is the final model:

with Model() as sw_model:

    sigma = HalfCauchy('sigma', beta=10, testval=1.)

    # Discrete switchpoint on the same 0-90 range as x
    switchpoint = DiscreteUniform('switchpoint', lower=0, upper=90, testval=45)

    # Priors for pre- and post-switch intercepts and slopes
    intercept_u1 = Uniform('Intercept_u1', lower=-10, upper=10, testval=0)
    intercept_u2 = Uniform('Intercept_u2', lower=-10, upper=10, testval=0)
    x_coeff_u1 = Normal('x_u1', 0, sd=20)
    x_coeff_u2 = Normal('x_u2', 0, sd=20)

    intercept = switch(switchpoint < x, intercept_u1, intercept_u2)
    x_coeff = switch(switchpoint < x, x_coeff_u1, x_coeff_u2)

    likelihood = Normal('y', mu=intercept + x_coeff * x, sd=sigma, observed=y)

    start = find_MAP()

    # Gradient-based NUTS for the continuous parameters,
    # gradient-free Metropolis for the discrete switchpoint
    step1 = NUTS([intercept_u1, intercept_u2, x_coeff_u1, x_coeff_u2])
    step2 = Metropolis([switchpoint])

    trace = sample(20000, step=[step1, step2], start=start, njobs=4, progressbar=True)

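For completeness, here is a minimal sketch of how the trace could be inspected once sampling finishes, so the recovered switchpoint and segment parameters can be compared with the true values (bp_u = 30, c_u = [1, -1], beta_u = [0, -0.02]) used to simulate the data. This snippet is not from the original answer; it assumes the trace object produced by the final model above, and the burn-in of 5000 draws is an arbitrary choice.

# Sketch only: assumes `trace` from the final model above; burn-in length is arbitrary
import matplotlib.pyplot as plt
import pymc3 as pm

burned = trace[5000:]  # drop burn-in draws from every chain

# Posterior means of the changepoint and the per-segment parameters
for name in ['switchpoint', 'Intercept_u1', 'Intercept_u2', 'x_u1', 'x_u2', 'sigma']:
    print(name, burned[name].mean())

pm.traceplot(burned)  # visual check of mixing and the marginal posteriors
plt.show()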
This concludes the question and answer on PyMC3 regression with a changepoint; hopefully the accepted answer above is helpful.
