
Re: Meeting Friday



Inline ------->

Quoting Ben Vandiver <vandiver@gmail.com>:

> As "usual" (?) we'll be having a meeting on Friday at 1pm.
> 
> Emily and I have been working on understanding where the missing link
> is between the rest of the pipeline (which ends in walls) and the
> BaseGen code.  We've managed to discover where most of the inputs to
> BaseGen come from, but it appears to utilize a directory full of files
> (mit/src/BaseGen/Input/Bldg) which we cannot discover the source of. 
> These contain transformed portal data, much like what is almost output
> by walls (the data is not transformed when it comes out of walls).  How
> are these files produced?  Is this the missing link that needs to be
> added to bring the BaseGen code into the overall pipeline?

--------->
It is actually pretty simple. Mic is the right person to ask. The idea is that
he gives me those files in the Bldg directory, and then I pick them up and use
them in my applications. I do not know exactly how he produces those files; I
have always been under the impression that they come out of "walls", but
Michael should know for sure. And yes, I guess this is the most important link
between BaseGen and the rest of the pipeline. As far as I remember, the Bldg
directory contains transformed building contours that I "cut out" from the
basemap to ensure that building models can be assembled properly with the 3D
basemap model.
 
> So Emily has been thinking about how to apply Vitaly's fixes to the
> building transforms to the "spreadsheet".  From looking through the
> code, we think that this means adding some code around the line in
> pipeline.py that reads "# FIXME insert vitaly's transforms".  In order
> to do so, Emily needs to write some Python code that reads
> run/input/mit_building_corrections.xml and applies these deltas to the
> "buildings" list produced by strip_spreadsheet.  The plan is for Emily
> to work on understanding XML parsing in Python for Friday, and then
> work on applying it to solve the given problem.

-------->
My application, which allows you to correct contours after they have been
computed by applying the transforms in that bogus .EXL file (I guess it is
Excel and not .XML), gives (1) the corrected set of transforms, and (2) the
set of transformations (translation, scale, and rotation) that need to be
applied in addition to those from the .EXL file (in both cases, they are
given per building). Therefore, what Emily ultimately needs to do is to read
the transforms from the .EXL file and "multiply" them with those produced by
my application. In fact, you do not even need to automate this process,
because as soon as you have the right set of transforms, you can give them
back to the DOF guys and they will be happy to have them. If they decide to
add another building at some point, it will be their responsibility to make
sure that the new transforms are correct, because if the transforms are not
correct, we will have to match the new contours manually again anyway.
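
In case it helps Emily, here is a rough Python sketch of what I have in mind.
I do not know the actual schema of mit_building_corrections.xml or what the
"buildings" entries from strip_spreadsheet look like, so every element name,
attribute name, and dictionary key below is a guess; only the overall shape
of the computation is the point:

    import math
    import xml.etree.ElementTree as ET


    def affine(tx=0.0, ty=0.0, scale=1.0, rot_deg=0.0):
        # 3x3 affine matrix: rotate/scale about the origin, then translate.
        c = math.cos(math.radians(rot_deg))
        s = math.sin(math.radians(rot_deg))
        return [[scale * c, -scale * s, tx],
                [scale * s,  scale * c, ty],
                [0.0,        0.0,       1.0]]


    def multiply(a, b):
        # Compose two affine matrices; the result applies b first, then a.
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]


    def read_corrections(path="run/input/mit_building_corrections.xml"):
        # Guessed schema: <building id="..." tx="..." ty="..." scale="..." rot="..."/>
        out = {}
        for bld in ET.parse(path).getroot().iter("building"):
            out[bld.get("id")] = affine(tx=float(bld.get("tx", 0)),
                                        ty=float(bld.get("ty", 0)),
                                        scale=float(bld.get("scale", 1)),
                                        rot_deg=float(bld.get("rot", 0)))
        return out


    def apply_corrections(buildings, corrections, extra_fixes):
        # 'buildings' is assumed to be the list from strip_spreadsheet, with
        # an "id" per entry; 'extra_fixes' holds the additional per-building
        # transforms my application produces.
        for b in buildings:
            delta = corrections.get(b["id"], affine())
            fix = extra_fixes.get(b["id"], affine())
            b["transform"] = multiply(fix, delta)  # .EXL delta first, then my fix
        return buildings

The idea is simply that each per-building correction becomes an affine matrix
and the "multiplication" is ordinary matrix composition; the real attribute
names have to come from Michael or from the file itself.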

> Ben's hacking on getting the Java SDK installed on Glint.  Michael, is
> it already installed?  If so, I can't find it.  I, however, need root
> in order to complete the install.  If you have root and want to do it
> yourself, there are a pair of tarballs in /scratch/benmv/ which should
> do the trick.  If not, let me know the root password in some secure
> manner and I'll take care of it.
> 
> -Ben
> 

Vitaliy