# Updated Performance-based Fisher optimization for CMB-S4

As we gear up for increasingly complex data challenges, we want to expand the set of available frequencies in order to explore any potential benefit to parameter constraints. As with DC1.0, the next version of the data challenge maps needs to be informed by a thorough performance-based optimization (for the small S4 survey, this is an optimization over $$\sigma_r$$).

On March 31st, 2017, the Forecasting group agreed on new band definitions, which are documented in this posting.

In this posting I optimize for $$\sigma_r$$ given these new band definitions. The framework I use is identical to the performance-based framework described in this posting, which was used for the Science Book Inflation forecasts.

## 1. Worked-out Example; Experiment Specification

Below, similar to Sections 2 and 3 of the posting linked above, I present an application of this framework to an optimization grounded in achieved performance.

• For this particular example I assume eleven S4 channels: {10, 15, 20, 30, 40, 85, 95, 145, 155, 215, 270} GHz, two WMAP channels: {23, 33} GHz, and seven Planck channels: {30, 44, 70, 100, 143, 217, 353} GHz.

• This example assumes BICEP3-size apertures and scales the beams accordingly. Note: this is not true for the 10, 15, and 20 GHz channels, which assume a beam equal to that at 30 GHz (i.e. we assume the aperture scales by the required amount to keep the beam fixed to the 30 GHz value).

• The assumed unit of effort is equivalent to 500 det-yrs at 150 GHz. For other channels, the number of detectors is calculated as $$n_{det,150}\times \left(\frac{\nu}{150}\right)^2$$, i.e. assuming a comparable focal-plane area. The projections run out to a total of 5,000,000 det-yrs (if all at 150 GHz, this corresponds to 500,000 detectors operating for 10 yrs), which seems like a comfortable upper bound on what might be conceivable for S4; S4-scale surveys seem likely to fall in the range of $$10^6$$ to $$2.5\times10^6$$ det-yrs.
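The det-yr scaling above can be sketched as follows (a minimal illustration; the function name `det_yrs_per_unit_effort` is a hypothetical helper, not part of the actual optimization code):

```python
import numpy as np

# S4 channel centers in GHz (the 10-20 GHz channels share the 30 GHz beam)
freqs = np.array([10, 15, 20, 30, 40, 85, 95, 145, 155, 215, 270])

def det_yrs_per_unit_effort(nu, n150=500.0):
    """Detector-years bought by one unit of effort at frequency nu [GHz],
    assuming a fixed focal-plane area so detector count scales as (nu/150)^2."""
    return n150 * (nu / 150.0) ** 2

# det-yrs per unit of effort in each channel
effort = det_yrs_per_unit_effort(freqs)
```

By construction one unit of effort buys 500 det-yrs at 150 GHz and correspondingly fewer detectors at the lower frequencies, where each detector occupies more focal-plane area.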

• I first want to emphasize that the NET numbers that follow are used only to determine the scalings between different channels, and NOT to calculate sensitivities. All sensitivities are based on achieved performance. The ideal per-detector NETs are assumed to be {234, 228, 214, 177, 224, 270, 238, 309, 331, 747, 1281} $$\mu\mathrm{K}_\mathrm{CMB} \sqrt{s}$$. This is the last column of the table in the Band Definition posting. Note: these updated NETs are calculated for a 100 mK bath, as opposed to 250 mK before, and are therefore lower than before.
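As a rough illustration of how ideal NETs set only the *relative* channel weights, here is a sketch assuming simple inverse-variance weighting combined with the $$\nu^2$$ detector-count scaling above (an assumed weighting scheme for illustration; the posting's actual sensitivities come from achieved-performance rescaling):

```python
import numpy as np

freqs = np.array([10, 15, 20, 30, 40, 85, 95, 145, 155, 215, 270])  # GHz
nets = np.array([234, 228, 214, 177, 224, 270, 238, 309, 331, 747, 1281])  # uK_CMB sqrt(s) per det

# Relative detector counts under the fixed focal-plane-area assumption
n_det = (freqs / 150.0) ** 2

# Inverse-variance weight per unit observing time: more detectors and
# lower NET both increase a channel's relative statistical weight
weights = n_det / nets**2
weights /= weights.max()  # normalize to the best channel
```

Only the ratios between channels matter here; the overall normalization is set elsewhere by the achieved-performance scaling.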

• The BPWFs, $$\ell$$-binning, and $$\ell$$-range are assumed to be $$\ell = [30, 330]$$, yielding 9 bins with nominal centers at $$\ell$$ of {37.5, 72.5, 107.5, 142.5, 177.5, 212.5, 247.5, 282.5, 317.5}.
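The nominal bin centers above follow from a uniform binning of width 35 (an assumption inferred from the spacing of the centers; the actual BPWFs come from the analysis pipeline):

```python
import numpy as np

# Nine uniform ell bins whose nominal centers match the posting
n_bins, width, first_center = 9, 35, 37.5
centers = first_center + width * np.arange(n_bins)
# centers -> [37.5, 72.5, 107.5, 142.5, 177.5, 212.5, 247.5, 282.5, 317.5]
```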

• The Fiducial Model for the Fisher forecasting is centered at $$r$$ of 0, with $$A_{dust} = 4.25$$ (best-fit value from BK14) and $$A_{sync}=3.8$$ (95% upper limit from BK14). The spatial and frequency spectral indices are centered at $$\beta_{dust}=1.59, \beta_{sync}=-3.10, \alpha_{dust}=-0.42, \alpha_{sync}=-0.6$$, and the dust/sync correlation is centered at $$\epsilon=0$$. I also introduce $$\delta_{dust}$$, a dust decorrelation parameter (always ON), and $$\delta_{sync}$$, a sync decorrelation parameter (toggled ON and OFF). The dust decorrelation parametrization is exactly as described in Section 5 of the initial optimization posting. The synchrotron decorrelation parameter is introduced here for the first time; it has the same frequency and spatial form as the dust decorrelation parameter, but is normalized at (23 GHz, 33 GHz, $$\ell=80$$). While this parameter is allowed to vary freely in the Fisher optimization, I center it at zero, given that we have no good information on its value.
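For concreteness, a minimal sketch of the sync decorrelation form implied by the pivot quoted above (this is an assumed first-order parameterization written down for illustration; the exact functional form is the one given in Section 5 of the initial optimization posting):

$$\Delta_{sync}(\nu_1,\nu_2,\ell) = 1 - \delta_{sync}\,\frac{\log^2(\nu_1/\nu_2)}{\log^2(23/33)}\left(\frac{\ell}{80}\right)^{\gamma},$$

so that $$\delta_{sync}$$ is the decorrelation between the 23 and 33 GHz bands at $$\ell=80$$, with the spatial exponent $$\gamma$$ taking the same form as assumed for dust.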

• The Fisher matrix is 10-dimensional. The 10 parameters we are constraining are {$$r, A_{dust}, \beta_{dust}, \alpha_{dust}, A_{sync}, \beta_{sync}, \alpha_{sync}, \epsilon, \delta_{dust}, \delta_{sync}$$}, where $$\beta_{dust}$$ and $$\beta_{sync}$$ have Gaussian priors of width $$0.11$$ and $$0.30$$ respectively, and the rest have flat priors.
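In Fisher forecasting, a Gaussian prior of width $$\sigma_p$$ on a parameter enters as an additive $$1/\sigma_p^2$$ term on the corresponding diagonal element, and flat priors add nothing; $$\sigma_r$$ is then the marginalized error from the inverted matrix. A minimal sketch (the function name and the toy diagonal Fisher matrix are hypothetical, for illustration only):

```python
import numpy as np

# Parameter order assumed: {r, A_dust, beta_dust, alpha_dust, A_sync,
#  beta_sync, alpha_sync, eps, delta_dust, delta_sync}
def add_priors_and_get_sigma_r(F, sigma_beta_dust=0.11, sigma_beta_sync=0.30):
    """Add Gaussian priors on beta_dust and beta_sync, then return the
    marginalized sigma_r (parameter index 0)."""
    Fp = F.copy()
    Fp[2, 2] += 1.0 / sigma_beta_dust**2   # beta_dust prior
    Fp[5, 5] += 1.0 / sigma_beta_sync**2   # beta_sync prior
    cov = np.linalg.inv(Fp)                # marginalizes over all other params
    return np.sqrt(cov[0, 0])

# Toy diagonal Fisher matrix, just to exercise the function
F = np.diag(np.full(10, 1.0e4))
sigma_r = add_priors_and_get_sigma_r(F)
```

For this toy diagonal matrix the priors tighten only the spectral-index rows and leave $$\sigma_r$$ at $$1/\sqrt{F_{rr}}$$; in a realistic, non-diagonal Fisher matrix the priors also shrink $$\sigma_r$$ through the off-diagonal correlations.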

• As before, I implement delensing as an extra band in the optimization. See the description underneath Table 1 in this posting for a more in-depth description of how this is done.

## 2. Parameter Constraints; $$\sigma_r$$ performance

Table 1:
This table summarizes the pager above for the Sync Decorrelation "On" setting, at 1M det-yrs (150 GHz equivalent).

One can note that including Window 0 changes the formal optimal solution: some effort is dedicated to Window 0, and the effort distribution across the other Windows shifts as well. Including Window 0 improves $$\sigma_r$$ by roughly 20% in almost all cases.
Columns $$W_0$$ through $$W_{DL}$$ give the fraction of effort placed in each window; the "Split" rows list only the resulting $$\sigma_r$$ and effective $$A_L$$.

| $$f_{sky}$$ | Windows Included | $$W_0$$ | $$W_1$$ | $$W_2$$ | $$W_3$$ | $$W_4$$ | $$W_{DL}$$ | $$\sigma_r$$ ($$\times 10^{-3}$$) | Effective $$A_L$$ | Path |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.01 | W0-4 | 1.0% | 10.0% | 31.5% | 13.5% | 8.75% | 35.25% | 0.56 | 0.048 | Optimal |
| 0.01 | W0-4 | | | | | | | 0.60 | 0.048 | Split |
| 0.01 | W1-4 | --- | 11.25% | 29.0% | 19.0% | 11.25% | 30.0% | 0.69 | 0.052 | Optimal |
| 0.01 | W1-4 | | | | | | | 0.71 | 0.052 | Split |
| 0.03 | W0-4 | 0.5% | 5.0% | 37.5% | 13.5% | 11.5% | 32.0% | 0.65 | 0.088 | Optimal |
| 0.03 | W0-4 | | | | | | | 0.70 | 0.088 | Split |
| 0.03 | W1-4 | --- | 11.25% | 34.75% | 16.0% | 11.5% | 27.0% | 0.79 | 0.096 | Optimal |
| 0.03 | W1-4 | | | | | | | 0.84 | 0.096 | Split |
| 0.10 | W0-4 | 3.0% | 2.5% | 41.5% | 15.0% | 11.5% | 26.5% | 0.78 | 0.18 | Optimal |
| 0.10 | W0-4 | | | | | | | 0.86 | 0.18 | Split |
| 0.10 | W1-4 | --- | 8.5% | 39.0% | 15.0% | 14.0% | 23.5% | 0.95 | 0.19 | Optimal |
| 0.10 | W1-4 | | | | | | | 1.05 | 0.19 | Split |

## 3. To come soon:

• Fixed levels of delensing for ease of comparison with other groups.
• Varying levels of foregrounds for the higher $$f_{sky}$$ choices.
• Noise fits for the next version of the Data Challenge.