
me/walkthru, status



i'm back on the walkthru wagon now.  met with seth friday
to discuss plans for moving forward.  here's a summary of
the outcome from that, plus progress since then.  also a bit
about what i might need from each of you, at the bottom.

top-level tasks i'm working on:

1. getting all of mit into walkthru for viewing, fixing
   any impediments along the way.  this means actually
   cons'ing up a model such that walkthru will load it
   usefully, tho rendering will probably be SLOW at first.

2. enhancing visibility algs in walkthru to accelerate
   viewing for huge, detailed indoor/outdoor models
   (notably, our mit campus model).

basic technical plans for making those start to happen:

1. i obtain sufficient data to test with from mic (DONE - a
   few minor changes requested, but nothing critical) and
   from vitaly (not needed yet).

2. i conjure up a walkthru DB for the campus model, which is
   composed of:
   - a top-level "outdoor" scene that instances two kinds of
     things: at first just buildings, and later also
     "outdoor details".
   - individual building DB files just like what walkthru
     already supports, except that walkthru will now be
     loading them within a larger scene context.
   (there's a rough struct sketch of this right after this
   list.)

3. i make walkthru deal with that 1 extra level of hierarchy,
   updating all internal details needed to make the whole
   system operate with the new world view.  (not as bad as
   it sounds.  unless you think it sounds easy, in which
   case you should think about that again.)
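
to make that extra level of hierarchy concrete, here's a minimal
sketch of what the DB-layer structs could look like -- roughly what
the "World/Bldg Encapsulation" step below would add.  this is purely
illustrative: none of these names are the actual walkthru structs,
just my guess at the shape of things.

   // rough sketch only -- hypothetical names, not the real walkthru DB structs.
   #include <string>
   #include <vector>

   // 4x4 transform placing a building DB within the outdoor scene.
   struct Xform { double m[4][4]; };

   // one instanced building: which existing single-bldg DB file to load,
   // and where it sits in world space.
   struct BldgInstance {
       std::string dbFile;      // per-building DB, same as walkthru uses today
       Xform       bldgToWorld; // building-local -> world coordinates
   };

   // the new top-level "outdoor" scene: exterior geometry plus the
   // list of building instances it contains.
   struct WorldScene {
       std::string               outdoorDbFile; // terrain + "outdoor details"
       std::vector<BldgInstance> bldgs;         // one entry per building
   };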

that was the hand-wavy logical description.  here's a 5-step
implementation plan as i will actually be doing it:

1. "Exterior Queries"
   modify portal traversal alg to handle queries originating
   from a frustum that is outside the scene extents.  this
   alone will not be exercised from the existing walkthru
   prog yet, but is functionality required before stuff below
   can be implemented.  i will put some simple temp hack in
   walkthru to unit test it before going on.

2. "World/Bldg Encapsulation"
   get the extra layer of hierarchy into walkthru so that it
   can recognize a building as a subspace within a world model,
   and [almost] do visibility traversal into and out of the
   bldgs.  this step is basically DB layer mods, to support
   the extra layer of hierarchy in its structs.  again, write
   some hack to unit test it.

3. "N+1 Isolated Queries"
   first vis mods, enabling viewing of multi-bldg scenes.
   for N bldgs in a scene, N+1 initial vis queries will be
   made from any starting or continuing location in the
   exterior world space: 1 for the exterior space (which
   treats bldgs as sealed boxes), and 1 for each bldg.
   the query going into a building treats it as its own
   scene and works "automagically" by simply giving an
   external query to start it (as enabled by <1> above).
   in addition, traversal back out of the bldg scene should
   be transformed and handed back to world space traversal
   as well, being careful to avoid possible circularity
   cases (but that might be better left to correct impl
   within the framework to be provided by <5> below).

4. "N 'Gated' Queries"
   next step up in sophistication from the previous.  enhance
   the "exterior traversal" alg to do bldg occlusion.  issue
   queries going into bldgs which are restricted based on how
   the frustum is already occluded by other bldgs.  (e.g. if
   only one window of a bldg sticks around another, only enter
   that window.  if half a window shows, constraints can be
   synthesized to represent that to the traversal alg as well.
   i know how to do this.)

5. "New Visibility Algs"
   now i can implement everything i described in my AUP paper.
   i won't try summarize any of it here.
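
to make the structure of steps <1>, <3>, and <4> a bit more
concrete, here's a rough sketch of how the per-frame query loop
could look.  again, this is just an illustration under my own
assumptions -- none of these names are real walkthru functions,
and the stubs obviously do nothing.

   // hypothetical sketch of the N+1 query structure (steps 1, 3, 4).
   #include <cstddef>
   #include <vector>

   struct Frustum { /* view volume, in some coordinate space */ };
   struct CellSet { void merge(const CellSet&) {} /* potentially visible cells */ };

   struct Bldg {
       // world-space frustum -> this building's local coordinates (stub).
       Frustum toLocal(const Frustum& f) const { return f; }
       // <1> exterior query: portal traversal started from a frustum that
       // lies outside this building's extents (stub).
       CellSet exteriorQuery(const Frustum&) const { return CellSet(); }
   };

   struct World {
       std::vector<Bldg> bldgs;
       // exterior-space traversal that treats each bldg as a sealed box.
       // for step <4> it would also narrow each per-bldg frustum to the
       // portion not already occluded by other bldgs ("gated" queries);
       // for step <3> the per-bldg frusta are just left as the full one.
       CellSet exteriorTraversal(const Frustum&,
                                 std::vector<Frustum>&) const { return CellSet(); }
   };

   // N+1 queries for a viewpoint in the exterior world space:
   // 1 for the outdoor scene, plus 1 per building.
   CellSet visibleFromOutside(const World& world, const Frustum& f)
   {
       std::vector<Frustum> perBldg(world.bldgs.size(), f);
       CellSet visible = world.exteriorTraversal(f, perBldg);
       for (std::size_t i = 0; i < world.bldgs.size(); ++i) {
           const Bldg& b = world.bldgs[i];
           // each building is queried as its own scene, started by an
           // external query (<1>), using the possibly-restricted frustum.
           visible.merge(b.exteriorQuery(b.toLocal(perBldg[i])));
       }
       return visible;
   }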

writing is good.  i'm not 100% sure of 3/4/5, so seth, check me
on this.  does it still match what you remember?  specifically,
where would you expect the handling of vis traversal back out of
a bldg into exterior space to start working right?  do you think
it will fall cleanly out of a decent impl for <3>, or should it
be put off til later?  i think it can happen in <3>, but i'm not
fully sure yet just from thinking about it at this moment.
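
for the record, one way i could imagine guarding against the
circularity cases when traversal exits a bldg back into world
space (purely a guess at this point, hypothetical names only):
have each query carry a visited set keyed by (space, cell), so
traversal never re-enters a cell it has already walked through.
something like:

   #include <set>
   #include <utility>

   // hypothetical circularity guard: a space id of -1 could mean the
   // exterior world, and 0..N-1 the individual bldgs.
   struct QueryState {
       std::set<std::pair<int, int> > visited; // (space id, cell id)

       // true the first time a (space, cell) pair is entered,
       // false if we'd be walking back into somewhere we've been.
       bool enter(int spaceId, int cellId) {
           return visited.insert(std::make_pair(spaceId, cellId)).second;
       }
   };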

so that's it for my own impl plans, as far as things you'd like
to know, and probably a bit extra that you don't.  feel free to
comment at this level, especially if you see anything that could
possibly introduce dependencies between us.  i see only a few
such things and will comment on them now, in the form of
"what i need from you":

mic:  models.  we've talked.  for the most part, the way things
  are being done now is good and i can deal with it.  i've made
  a few minor requests for things that would make my life
  easier, but nothing important enough to mention here.
  i expect to get some revised data from mic this week but
  am not blocking on it.

vitaly: a model of the outdoor mit campus.  i don't need it yet,
  but maybe 1-2 weeks from now i will.  (tho i can fake it
  if i don't have it.)  the first thing should be just a simple
  terrain with some nice elevation changes.  then maybe some
  random detail objects to make the renderer work harder.
  eventually, of course, the goal is a totally realistic
  representation of everything outdoors.

patrick: don't let me drink too much coffee.

so in short, i think i pretty much have everything i NEED
right now, and i'm not dependent on anyone.  i want a few things
as mentioned above, but i can hack around them as needed...
just keep me up to date on where you all are so i know what
data drops i can expect to get and when.

finally, a brief report on progress so far (work i did fri/sat).
i took the data mic had, which is about 90% of campus, and
did a few things with it:

 - i crammed it all into one enormous scene file which lays
   out the buildings roughly in position (courtesy of mic's
   spreadsheet wizardry), but unrotated and scaled down by
   1/2 to try and prevent overlaps.  the big file contains
   a total of 579983 UG face statements.

 - i converted that to an IV file and threw it all at ivview
   on glint (the latest onyx quad with IR4 graphics).

   result: slow load, then about 1 fps performance, always.
   that includes about a second to refresh any time a section
   of the viewer window gets exposed (even if only briefly
   obscured).  this tells me the model is large enough to be
   past the point where even IR4 hardware can just take it
   and deal with it in realtime: any exposure or view change
   causing a comparable refresh delay probably means we're
   thrashing from graphics hardware back to main mem and
   becoming i/o bound.  yay!  the assumption that this would
   happen is why i exist here, after all. :)

 - i tried a naive load of the big dataset directly into
   walkthru.  so far, i am still building the DB.  any
   step that requires loading the complete UG file takes
   about 4-5 hours just to do the load.  (again, on glint,
   purely local disk file i/o.)  for example, running
   axialsplit over the full dataset (which is obviously the
   wrong approach, i know) takes that long to load the UG
   file and then finishes the actual split in less than a
   minute.  after that you need to run wksplit, which also
   takes the UG file as input (why?), spends several hours
   loading it, and then performs the split in approximately
   one eyeblink.

 - wkadd just completed as i write this.  total DB size
   is about 600M.  i'm trying a load into the viewer without
   wkvis'ing first, just to see what happens.  may then
   try wkvis.  (are there options to make it not take forever
   doing the portal computations we already know will be
   near-useless on this model?)

 - it's been over 10 minutes and the viewer has not yet
   finished startup (but it has not crashed).  i presume
   loading and struct init are still in progress, since the
   user mem footprint is just passing 600M and still
   growing.

prelim conclusions from the wk tests:

the 90% campus model is still at a scale where walkthru
can just barely handle it, even in cases like UG loads
where it is pushing the limits of parts that were presumably
coded less scalably than they could be.  i'll have to
look at those and fix them before going larger, however.
there are ways i can actually exhaust user mem space,
such as running ugflatten over the whole campus or
wkadd'ing the whole-campus file all at once, so the
limitation involved there does need to be addressed
directly before scaling up further.  just waiting longer
for completion is not an option past this size.

the actual DB facilities once we're out of UG-land
appear to at least not be dying, but they are definitely
getting pushed past reasonable limits by this brute-force
huge-scene generation.

also, this scene is already past the limits of realtime
for IR4/ivview, so it is a fair first test case, right
at the edge for us.  i'll continue working with this
dataset as i begin the impl plan outlined above.

i'll wait a few more minutes to see where the viewer
goes with the brute-force db file, but will then switch
to doing things intelligently: feeding a
"slightly-less-brute-force" full model into the
existing system to get a baseline measurement that
doesn't take 4 hours per step (using bboxes/hulls/lod
where applicable), and beginning the real work
(starting from "1. Exterior Queries" as described
above).

oh, that viewer run i started just got to the point of
displaying the first frame in the main window.  looks
nice, but took a while... precisely 39 min from the time
the program started, and now at 1.0G mem usage.  i bet
i'll have a second frame in another 40 minutes...
IF i wait for it.  so there's our ultra-brute-force
baseline for now: 40 minutes per frame, and a firm
reassurance that brute force will NOT solve this. :)

onward, to doing it the right way...

--pl