Subject: Re: Comments from Tony on Gary's questions
From: Gary Bernstein
Submitted: 10 Jun 2004 15:01:13 -0400
Message number: 224
In an attempt to resolve as quickly as possible a couple of factor-of-3-10
discrepancies in the WL section, here are some elaborations on the points I
originally made to Michael.
Gary
Key:
> <  Gary's original questions
>    Tony's responses
(no prefix)  Gary's additional comments
> < p47: The argument for resolving 85 galaxies per square arcmin with
> < LSST images is not valid. It states that 70 galaxies per arcmin^2
> < were detected without crowding in the HDF simulation; this does not
> < mean that these galaxies were usefully resolved. There is also the
> < implication that 200x10s of 6.9-meter LSST data will resolve all the
> < galaxies that are unresolved in a 3600s 6.5-meter Magellan image with
> < the same seeing. Why should this be true?
>
> I do not believe that 85 galaxies per square arcmin will be usefully
> resolved by LSST in 0.7" seeing, and the DRM should not say so.
> 85 will be detected. We have done a number of simulations based on a
> mosaic of HDF fields. The mosaic is lensed by modest-mass clusters,
> seeing is applied, the image is box-averaged down to 0.2"/pixel, and
> noise appropriate to 200 LSST 10 s exposures in R or V is added. Our
> WL pipeline, which eliminates overlaps and rejects galaxies whose size
> is under 1.3 times the PSF, is then run, and the shear and mass maps
> are produced. This is repeated for seeing ranging from 0.5 to 0.9
> arcsec FWHM, and it is the basis for the claimed performance vs.
> seeing for LSST. As a sanity check we have used the same simulator,
> modified to simulate Magellan, Subaru, and DLS (4m) data, and the
> results compare favorably with actual data run through the same
> pipeline. The key is LSST's ability to reach low surface brightness
> for each patch (low f/#). In such a stack, with selected seeing
> averaging 0.7" FWHM, LSST should usefully resolve 60 galaxies per
> sq. arcmin or a bit more, in V and R separately, over 10 years.
>
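To make the size cut Tony describes concrete, here is a minimal sketch; the
quadrature-addition seeing model and the example sizes are my assumptions for
illustration, not the actual pipeline's shape measures:

```python
import math

def usable_for_shear(gal_fwhm_arcsec, psf_fwhm_arcsec, min_ratio=1.3):
    """Mimic the pipeline cut: reject galaxies whose observed size
    is under min_ratio times the PSF size."""
    return gal_fwhm_arcsec >= min_ratio * psf_fwhm_arcsec

def observed_fwhm(intrinsic_fwhm, psf_fwhm):
    """Approximate seeing convolution by adding sizes in quadrature
    (exact for Gaussians; an assumption here)."""
    return math.hypot(intrinsic_fwhm, psf_fwhm)

psf = 0.7  # selected seeing, arcsec FWHM
for intrinsic in (0.2, 0.5, 0.8):
    obs = observed_fwhm(intrinsic, psf)
    print(f'intrinsic {intrinsic}" -> observed {obs:.2f}", '
          f"usable: {usable_for_shear(obs, psf)}")
```

In 0.7" seeing, only galaxies whose observed size clears 1.3 times the PSF
survive the cut, which is why the resolved count is so sensitive to seeing.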
I remain unconvinced, since no one has yet demonstrated that they are
extracting 60 useful shapes per sq arcmin from ground-based exposures of
any length with any telescope. If this number is used, it should be
qualified as an optimistic goal, not a demonstrated expectation.
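For reference, the naive scaling behind my original question can be written
out. This is only a sketch: it uses the standard sky-limited point-source
scaling S/N ~ D*sqrt(t) and ignores read noise in a 200-exposure coadd as
well as the low-f/# surface-brightness point Tony raises:

```python
import math

def relative_sn(d_m, t_s, d_ref_m, t_ref_s):
    """Background-limited point-source S/N scaling: S/N ~ D * sqrt(t).
    (Signal ~ D^2 * t; sky noise ~ D * sqrt(t).)"""
    return (d_m / d_ref_m) * math.sqrt(t_s / t_ref_s)

# LSST stack: 200 x 10 s on a 6.9 m aperture, vs. a single
# 3600 s Magellan (6.5 m) image in the same seeing.
ratio = relative_sn(6.9, 200 * 10, 6.5, 3600)
print(f"LSST stack / Magellan S/N ~ {ratio:.2f}")
```

Under this naive scaling the 200x10s LSST stack is roughly 20% shallower than
the 3600s Magellan image, which is why the claim needs separate justification.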
>
>
> < p49: "must be stable on arcminute scales at the 1\% level during an
> < individual exposure"
> < -> "must be stable on arcminute scales at the 0.1\% level over the
> < timescale of several exposures."
> < Here's why: if the "shear floor" on the stack of 200 images
> < (100x2x10s exposures) needs to be 0.0001, and averaging over the 100
> < epochs reduces the systematic shear by sqrt(100), then the
> < systematic power must be 0.001 or lower in each epoch. It must also
> < be stable over several exposures if we are to combine the
> < information from many exposures to diagnose and remove this
> < systematic, which lies below the star-separation scale.
>
> There are separate specs for precision in the processed stack and for
> stability during an exposure. Your argument is correct if applied to
> the ultimate precision, but the spec in question concerns stability
> during an exposure. Stability on the timescale of an exposure and on
> arcminute scales is important because we use stars in the field to
> correct for PSF ellipticity, and that correction must be suitably
> stable on those spatial scales and timescales. Right now our 4m data
> is stable at the 10% level, and we are conservatively requiring 1%
> stability for LSST. We currently achieve better than a factor-of-10
> rounding of PSFs per exposure, and I would expect that to be even
> better by the time of LSST.
>
>
The sqrt(100) factor in my calculation is the best-case scenario for how
the single-image spec translates into a result on the stack. And
"PSF rounding" on the individual images cannot give improvement on
scales of variation smaller than the inter-star spacing in the
individual images, so no factor-of-10 reduction in single-exposure
systematics is possible on these small scales. Hence my conclusion that
each single exposure needs to be within a factor of sqrt(100) of the
ultimate systematics goal of 0.0002, and hence 0.002 should be the
upper limit on sub-arcminute systematics that vary on timescales of
20s or so.
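The budget arithmetic above in a few lines (the sqrt(N) averaging over
independent epochs is the best case, as noted):

```python
import math

n_epochs = 100       # 100 x 2 x 10s exposures in the stack
stack_goal = 0.0002  # ultimate systematics goal on the stack

# Best case: independent epochs average down by sqrt(N), so each
# single exposure may be at most sqrt(N) times the stack goal.
per_exposure_limit = stack_goal * math.sqrt(n_epochs)
print(per_exposure_limit)  # best-case limit: stack_goal * 10
```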
This is message 224 in the lsst-general archive, URL
http://www.astro.princeton.edu/~dss/LSST/lsst-general/msg.224.html