// Context here: https://docs.google.com/document/d/119_ADMYWSBdzAdqXMuKepphpQqZxSY7yU7DDCYjD-sY/edit

// Efficacy
percent_aerosol = 5% to 50% // percentage of pandemic deaths caused by aerosol transmission
aerosol_reduction = 20% to 40% // reduction in aerosol transmission by far-UVC
efficacy_gears_level = percent_aerosol * aerosol_reduction // this seems too high, so I'm using 1% for now
efficacy = 1%

// Baseline
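As a quick cross-check of the gears-level product, here is a Python sketch of the endpoint arithmetic. Note this is my own illustration, not part of the model: it treats the Squiggle `to` ranges as hard endpoints, whereas `to` actually denotes a 90% confidence interval.

```python
# Endpoints taken from the model above; multiplying them shows the span
# of the gears-level efficacy estimate (an approximation -- Squiggle's
# `to` ranges are 90% CIs, not hard bounds).
percent_aerosol_low, percent_aerosol_high = 0.05, 0.50
aerosol_reduction_low, aerosol_reduction_high = 0.20, 0.40

efficacy_low = percent_aerosol_low * aerosol_reduction_low    # 1%
efficacy_high = percent_aerosol_high * aerosol_reduction_high # 20%

print(f"gears-level efficacy range: {efficacy_low:.0%} to {efficacy_high:.0%}")
```

Even the low end of the product equals the 1% fallback value, which is why the gears-level estimate reads as high.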
@startClosed
@name("Documentation: Start Here!")
documentation = "This model contains the information necessary to support a forecasting competition to predict the ratings of upcoming movies on IMDB, Metacritic, and Rotten Tomatoes. To participate in the competition, write a function matching this signature:
```
fn(
  time: Date between 2024-04-01 and 2024-06-01,
  movieUrl: Metacritic movie ID like \"boy-kills-world\" or \"challengers\",
  scoreType: One of [\"imdb\", \"metacritic\", \"rottenTomatoes\"]
/* Modeling the annual probability of cooling events from volcanic eruptions */

// See the sister model, which focuses on the severity of volcanic cooling events, here: https://squigglehub.org/models/ASRS-Resilience/Volcanic-Cooling-Events

// Method 1 uses modeled temperature change data from Stoffel et al. (2015): https://dendrolab.ch/wp-content/uploads/2018/10/Stoffel_etal_NGEO_2015.pdf
// The paper models non-tropical cooling over land. This is often more extreme than overall land cooling, which is what we are modeling, so a discount is applied to account for the increased sensitivity.
/* Modeling the probability and severity of volcanic cooling events */

// Method 1: Based on the assumption that (A) 1+ degree cooling events happen with a period of 150 years, (B) 1.5+ degree cooling events happen with a period of 500 years, and (C) 2+ degree cooling events happen with a period of 3000 years [Source: Fig. 1d from Stoffel et al. (2015): https://dendrolab.ch/wp-content/uploads/2018/10/Stoffel_etal_NGEO_2015.pdf]
// Construct four models, two exponential and two power-law, and use a mixture of all four.
// Incorporate uncertainty into the relative frequency of 1-, 1.5- and 2-degree cooling events. 1.5-degree cooling events represent approximately 30% of 1-degree cooling events, and 2-degree cooling events represent approximately 5% of 1-degree cooling events. I introduce a subjective amount of uncertainty to these figures such that the 95th percentile is approximately 2x the mean:
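To illustrate how the exponential and power-law forms diverge, here is a Python sketch (my own illustration, not the model's code) that fits each curve through the two endpoint assumptions (1 degree at 150 years, 2 degrees at 3000 years) and compares the implied return period at 1.5 degrees against the assumed 500 years:

```python
import math

# Return-period assumptions from the model (Stoffel et al. 2015, Fig. 1d)
dt1, p1 = 1.0, 150.0   # 1+ degree cooling: 150-year return period
dt2, p2 = 2.0, 3000.0  # 2+ degree cooling: 3000-year return period

# Exponential form: period(dT) = a * exp(b * dT), fit through both points
b_exp = math.log(p2 / p1) / (dt2 - dt1)
a_exp = p1 / math.exp(b_exp * dt1)

# Power-law form: period(dT) = a * dT**b, fit through both points
b_pow = math.log(p2 / p1) / math.log(dt2 / dt1)
a_pow = p1 / dt1 ** b_pow

# Implied return period at 1.5 degrees (assumed value in the model: 500 yr)
period_exp_15 = a_exp * math.exp(b_exp * 1.5)  # ~671 yr
period_pow_15 = a_pow * 1.5 ** b_pow           # ~865 yr

# Annual probability is the reciprocal of the return period
print(f"exponential: {1 / period_exp_15:.5f}/yr, power law: {1 / period_pow_15:.5f}/yr")
```

Both two-point fits imply a longer return period at 1.5 degrees than the assumed 500 years, which is one motivation for mixing several functional forms rather than committing to one.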
/* Modeling the annual probability of cooling events from nuclear winter */

// See this sister model for a fuller explanation of how we determine inputs for soot, cooling, etc.: https://squigglehub.org/models/StanP/Modeling-Nuclear-Winter-with-Uncertainty

// NUCLEAR WINTER
// The annual probability of a nuclear conflict with 100+ detonations. Our CEA estimates this at 0.12%. The three main sub-estimates were 0.06%, 0.10% and 0.13%, so I use a Beta distribution with mean 0.12% and with a 90% confidence interval of approximately (0.01%, 0.34%) that includes these sub-estimates.
annual_prob_100plus_detonations = beta(1.224, 998.776)
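A quick sanity check on the chosen parameters (my own sketch, not part of the model): a Beta(a, b) distribution has mean a / (a + b), so the parameters above reproduce the CEA point estimate.

```python
# beta(1.224, 998.776): parameters sum to 1000, so the mean is
# 1.224 / 1000 = 0.1224%, matching the CEA estimate of ~0.12%
a, b = 1.224, 998.776
mean = a / (a + b)
print(f"mean annual probability: {mean:.4%}")
```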
/* Describe your code here */
valuation_data = {
  qaly: 250k,
  qaly_percent: 0.71 to 0.84,
}
@hide
/* Modeling the global cooling rate expected from a large-scale nuclear exchange */

// Using the competing Reisner and Robock claims about the soot injection of a regional nuclear exchange as a basis, this model extrapolates to estimate the level of cooling expected from a nuclear war of at least 100 detonations.

// Black carbon (known as BC or soot) in the stratosphere from a 100x15kT-warhead nuclear exchange (Tg), all strikes countervalue. Use the Robock estimate of 5 Tg as the 95th percentile. Reisner estimates 0.2 Tg, but claims this is an overestimate since it assumes all combustible material is converted to BC, while the true amount would be 10-100 times less. Denkenberger & Pearce (2018) model the soot-emission factor as lognormally distributed with 90% CI (1%, 4%), which has mean value 2.2%. Hence our lower estimate, which we use as the 5th percentile, is 0.2 Tg * 0.022 = 0.0044 Tg.
bc_min_all_countervalue = 0.0044 to 5
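The lower-bound arithmetic can be spelled out in a short Python sketch (my own illustration, not the model's code), combining Reisner's estimate with the mean soot-emission factor from Denkenberger & Pearce (2018):

```python
# 5th-percentile soot estimate: scale Reisner's 0.2 Tg (which assumed all
# combustible material becomes black carbon) by the mean soot-emission
# factor of 2.2% from Denkenberger & Pearce (2018)
reisner_bc_tg = 0.2           # Tg of BC, assuming 100% conversion
soot_emission_factor = 0.022  # mean of lognormal with 90% CI (1%, 4%)

bc_lower_tg = reisner_bc_tg * soot_emission_factor
print(f"5th-percentile BC estimate: {bc_lower_tg:.4f} Tg")
```

This 0.0044 Tg figure is the lower endpoint of the `bc_min_all_countervalue` range, with Robock's 5 Tg as the upper endpoint.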
// How many dollars can be saved, if the problem is totally solved
problem_size =
  lognormal(log(100), log(10)) +  // journalists
  lognormal(log(1000), log(10)) + // news agencies
  lognormal(log(10), log(10))     // individuals

// Chance that we manage to implement a deeper and higher-quality search
better_tech = bernoulli(0.1)

// Number of months till the feature is copied
monopoly_time = lognormal(log(6), log(4))
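The `problem_size` sum can be sketched with a small Monte Carlo simulation in Python (my own illustration, not the model's code). Note that Squiggle's `lognormal(mu, sigma)` is parameterized by the underlying normal, the same convention as Python's `random.lognormvariate`, so `lognormal(log(100), log(10))` has median 100.

```python
import math
import random

random.seed(0)

def problem_size_sample():
    # Medians 100 / 1000 / 10, each with sigma = ln(10) on the log scale
    journalists = random.lognormvariate(math.log(100), math.log(10))
    news_agencies = random.lognormvariate(math.log(1000), math.log(10))
    individuals = random.lognormvariate(math.log(10), math.log(10))
    return journalists + news_agencies + individuals

samples = sorted(problem_size_sample() for _ in range(100_000))
median = samples[len(samples) // 2]
print(f"median problem size: ~${median:,.0f}")
```

With sigma = ln(10) the tails are heavy, so the mean of the sum is far above its median; the median is dominated by the news-agencies term.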