Brainstorming session from 4/29/99 TOF meeting
----------------------------------------------

(Note from editor: I typed up my notes and then rearranged them a little to
group concepts. I hope I haven't mangled anything too badly. Let me know
if you have any comments on the summary I made at the bottom.)


Visualization
-------------

I want to see the FOV on the sky (VTT seems to do this well).

Want a DSS or user-supplied image with the FOV displayed and the ability
to rotate it to explore orients. It should show which stars are the guide
stars and tell me the coordinates of where my mouse is pointed.

Want to be able to centroid for an object's coordinates within the
tool.
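
A rough sketch of what those two operations amount to, assuming a FITS
image with a valid WCS. This is only an illustration in Python (numpy and
astropy stand in for whatever the real tool would use); the file name,
box size, and pixel values are made up, not anything from the meeting:

    import numpy as np
    from astropy.io import fits
    from astropy.wcs import WCS

    def click_to_sky(header, x, y):
        """Return (RA, Dec) in degrees for a zero-based pixel position (x, y)."""
        return [float(v) for v in WCS(header).wcs_pix2world(x, y, 0)]

    def centroid(data, x, y, half_width=5):
        """Flux-weighted centroid of a small box around the clicked pixel
        (no edge handling; a sketch only)."""
        x0, y0 = int(round(x)), int(round(y))
        cut = data[y0 - half_width:y0 + half_width + 1,
                   x0 - half_width:x0 + half_width + 1]
        yy, xx = np.mgrid[0:cut.shape[0], 0:cut.shape[1]]
        total = cut.sum()
        cx = (cut * xx).sum() / total + x0 - half_width
        cy = (cut * yy).sum() / total + y0 - half_width
        return cx, cy

    # Hypothetical usage with a DSS cutout called field.fits:
    # with fits.open("field.fits") as hdul:
    #     cx, cy = centroid(hdul[0].data, 412.0, 305.0)
    #     print(click_to_sky(hdul[0].header, cx, cy))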

How your observations are placed on the sky is an important issue and
our current tools are not suitable. PIs want to be able to mosaic
without having to limit scheduling with an Orient. It would be good to
see all the options and then be able to choose which are acceptable.

Our tools need to be "accurate enough". For instance, what about
geometric distortion? This will be important for ACS. What about
distortion of the input image? This would imply that our metric tools
need to be integrated into the package.

The user should not have to worry about things that can be learned
from the image header. For instance, is the image pre- or post-COSTAR?
The software can get this from the header and display accordingly.
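
The sort of header lookup meant here might look like the following. The
INSTRUME keyword is a standard one, but the COSTAR flag used below is
only an illustrative assumption about what such a header could record:

    from astropy.io import fits

    def describe_optics(filename):
        """Report the instrument and (hypothetical) COSTAR state from the header."""
        header = fits.getheader(filename)
        instrument = header.get("INSTRUME", "unknown")
        costar = header.get("COSTAR")   # assumed boolean keyword, may be absent
        if costar is None:
            return f"{instrument}: COSTAR state not recorded in header"
        return f"{instrument}: {'post' if costar else 'pre'}-COSTAR optics"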

To summarize, something like the VTT is a very important improvement.


Access to all constraints
-------------------------

I don't like finding out about problems with guide stars several weeks
after submission. I want the bad news up front so I can make the
choice to change the orient or use single guide star guiding.

The new instruments will have more bright object concerns. That
information needs to be given to the user while constructing the Phase
2 and not several weeks after submission.


ETC
---

It would be nice to have a button in the tool that accessed the ETC
based on the information in the Phase 2. Trouble is that other
information is needed and we don't want to force everyone to supply
this extra information in the Phase 2.

But what if the tool accesses that information from a file that was
saved from the Phase 1 ETC work? This would only be lost if the PI
deleted the files.

I want a minimalist approach. There are risks in a fancy system. It
is hard enough to trust the ETC results now: not necessarily because I
think the software is flawed, but because I worry that I misunderstood
one of the questions the system asked. I like to play around until I
convince myself that I understand the nuts and bolts, and this raises
my confidence in the tool as well as my understanding of it.

Most interesting targets are not simple cases. You might not know the
redshift well or how extended the target really is. So you use the
tools to get a feel for the order of magnitude given various limits
and make the final decision with your gut.

A tool can give answers to well-formed questions, but ultimately the
users have to (and want to) choose for themselves. Sometimes they go
on hunches based on previous experience.

It would be great to have a tool that spits ETC results into Phase 2
format, but I would not want a system that is a black box and writes
the whole Phase 2 for me.

I want to see the intermediate information, not just the final answer.
Sometimes I use the intermediate information and ignore the final
answer.

On the other hand, there are processes that are so deterministic that
it is perfectly acceptable to do all the work for me. An example is
the ETC for STIS acquisition. The Institute has determined that an S/N
of 40 is required for a successful acquisition, and that's what it
gives you.
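
For the record, the deterministic calculation being described reduces to
solving S/N = sqrt(rate x time) for time, if one pretends the acquisition
is purely source-noise limited; the real ETC also folds in background,
dark current, and read noise. A sketch, with a made-up count rate:

    REQUIRED_SNR = 40.0   # the acquisition requirement quoted above

    def acquisition_exptime(count_rate, snr=REQUIRED_SNR):
        """Exposure time (s) to reach `snr`, Poisson source noise only."""
        return snr ** 2 / count_rate

    # e.g. a source delivering 200 counts/s needs 40**2 / 200 = 8 s
    print(acquisition_exptime(200.0))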

The concepts built into the current SEA ETC seem to be about as far as
you can go.

One thing it would be nice for the system to have built in is a
library of template spectra, so I don't have to go find them. There are
lots of nice energy distributions for stars, but what about all the
different galaxy models?


Phase 1
-------

Phase 1 isn't too bad as it is; the web tools are fine for that
purpose. Most of my time is spent in Phase 2.

People would prefer not to do too much work in Phase 1 since they most
likely will not be granted time. But if there were a way to find out,
without too much extra work, that something is infeasible, that would
be a good thing.

When I am doing a proposal that is a survey of, say, 10 targets, I don't
figure out exactly how much time each target will take. I figure out
the average properties of my targets, figure how much time the average
target will take, and multiply by 10. This is a reasonable
approximation and reduces the work.

The Phase 2 tools may be fun (or even helpful) for the more
experienced and particular Phase 1 proposer, but Phase 1 should not
drive the development of Phase 2 tools.

The greatest need for tools is in Phase 2, not Phase 1.

The TAC panel likes to read the arguments made by the PI in justifying
his time request. If the software just popped out a number, there would
be nothing to help the panel understand the request.

The Phase 2 tool could have a Phase 1 mode that is simplified.

A good Phase 1 tool could replace much of the text in the CP
(formulas, charts, rules). The tedium of following the paper
instructions leads to PI mistakes in overheads.

But having every proposal run a Phase 2 lite in Phase 1 is more work
for the proposer (maybe we don't care about that), but also more work
for the Institute, which has to support questions about the software
and maintain the rules in it. (Right now the whole technical review of
all the Phase 1s only takes a couple of days.)

In Phase 1 I ask for an allocation that will work. In Phase 2 I
optimize within the allocation that I have been given.

There is a good reason to continue to have two phases: the 5-to-1
ratio of proposals to acceptances.

But there is only one phase at most X-ray observatories.

But there are fewer knobs to tweak in an X-ray observation.

One ETC that combines all the instruments and documents would help the
user explore more quickly and make fewer mistakes. It should have a
modest, clean interface w/o a lot of Phase 2 overhead.



Documentation
-------------

The instrument scientists spend lots of time documenting features. On
the one hand we want all the documentation in and around the tools. On
the other hand we want to be able to read in English about the issues,
limitations, and capabilities.

The information that the instrument scientists generate should be
usable by software.

But many statements have caveats.

But a tool can throw up the caveats.

It doesn't seem practical to keep all this up to date. There would be
lots of coordination issues.

When the documentation changes it will need to be explained to the PIs.
But we have to do that now.


Answering questions
-------------------

I like doing all the details in Phase 2 myself, but not all users want
to be bothered with the details.

Could the PIs use a tool to help them select targets (for instance,
which of these targets are in the CVZ the longest)?

In ground-based observing it is necessary to ask what on this list
will be available in May. But this question is not relevant to HST
because all pointings are available. (The CVZ is an exception.) HST
users really don't choose their targets this way.

How about access into NED through the tool?

Not clear that this would be useful since everyone's needs are
different for querying NED.

Let's stay away from special cases and concentrate on the cases that
affect a lot of users. Example: Overheads. The system should tell you
what all the overheads are and what they depend on.

But if the overheads are being accurately calculated, why bother?

Ah, but there are strategy issues. A 180 s exposure incurs more
overhead than a 160 s exposure. If I know this and there is no
scientific reason to make the exposures 180 s, I could save a lot of
overhead.
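
That is the kind of rule a tip-oriented tool could encode. A toy model,
with a made-up boundary and made-up overhead values, just to show the
step in charged time:

    def overhead(exptime, boundary=170.0, below=60.0, above=180.0):
        """Hypothetical per-exposure overhead (s) with a step at `boundary` (s)."""
        return below if exptime <= boundary else above

    for t in (160.0, 180.0):
        print(f"{t:.0f} s exposure -> {t + overhead(t):.0f} s charged")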

What if the software concentrated on tips like this instead of dumping
all the overhead information?

The bottom line is that RPS2 is a black box to the user, especially
with respect to overheads.

I would like a slider bar for filling my orbit and then I could see
the jump in overhead when I reached a certain boundary.

How about an optimizer that fills the orbit for you? Or even rearranges
things to optimize buffer dumps?

Scientists do not like black boxes. They want to understand and they
do not want to waste time on trial and error optimization. (Need a
"white box" optimizer that tells them why something was done.)

It's a pain to have to take a visit out of the program to optimize it
because the software insists on working on the entire program.

The trial and error of RPS2 is the heart of the RPS2 speed issue.

TransVerse will optimize better.

I would like better/faster access to data about what a change in SCHED
or ORIENT will do for my schedulability. Maybe a slider bar.

Some constraints are interlocked with others and full processing
is required. But for those constraints that are not interlocked, why
should all the other processing be redone to show me the relationship
between the constraint and overall schedulability?

There are reports that PCs can make for a PI such as what orients
have guide stars for a particular observation. It does not have to
be iterative. We should compile a list of all the things PCs can
make for a PI. Maybe have a brainstorming session with the PCs.


Summary of concepts:
--------------------

1) Integrated visualization tool is an excellent idea.

2) Integrated single exposure time calculator would be useful in both
Phase 1 and 2.

3) Don't make PIs do a full Phase 2 in Phase 1.

4) Don't let Phase 1 needs drive Phase 2 tool design, but if the Phase
2 tool had Phase 1 utility that's a plus.

5) Give the PIs all the scheduling constraints; they want the bad news
up front while they are still working on the proposal.

6) Allow PI visibility into the system - black boxes are hard to trust.

7) Don't make the PI iterate or ask a PC to answer standard questions.

8) Integrating the documentation with the software should be explored.