How to input LIDAR data in ArcGIS

I had long been struggling with how to resample my billions of points so I could visualize them in 3D and perform further analysis over them, and I found out that ArcGIS would be the best solution.

The usual discretization was performed on my data: I scraped out all the noisy points by limiting my range to 5 cm to 100 m. Then I added another criterion, which is to remove all points below my LIDAR setup; that is, to keep only points with theta (horizontal FOV) < 90 or theta > 270 degrees. That reduced my point cloud to almost 30% of the original. Memory-wise, it was a LUXURY already!
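In case it helps anyone, here's roughly how that filter could look in Python. This is only a sketch: the Cartesian point layout, the axis convention (x forward, y left), and the function name are my assumptions, not necessarily how my actual code works.

```python
import math

def keep_point(x, y, z, r_min=0.05, r_max=100.0):
    """Range gate (5 cm - 100 m) plus azimuth cut, as described above.

    Assumes x points forward and y left in the sensor frame (a hypothetical
    convention); adjust the azimuth test to match your own setup.
    """
    r = math.sqrt(x * x + y * y + z * z)
    if not (r_min <= r <= r_max):
        return False  # outside the usable range, treat as noise
    theta = math.degrees(math.atan2(y, x)) % 360  # horizontal angle, 0-360
    return theta < 90 or theta > 270  # keep only theta < 90 or theta > 270
```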

Now, for checking irregularities in the observed point cloud, the Point File Information tool is by far the best solution. Using ArcGIS 10.0:

1. Go to 3D Analyst Tools > Conversion > From File > Point File Information.
2. You will be prompted to browse for the lidar point cloud by File or by Folder. (I suggest browsing by Folder if you've got more than 10 files.)
3. Now you will be prompted for the file format. Since I derived my point cloud manually from scratch, and most of my intensity returns are not very usable because we observed at night, I opted for the XYZ format instead. The binary LAS format is the mainstream interchange format; it is optimized and, I guess, what most vendors out there provide. Anyway..
4. Specify the suffix (file extension) and the Coordinate System, if you have one.

I tested this using just the first 200 rotations to check my point file information.

The process gives you the bounding rectangle and the attribute table, with the average point spacing contained in one of the attributes. This average point spacing is crucial when you load and visualize the point cloud later.

So, right-click the Pt_Spacing field in the attribute table and check the statistics to get the average (MEAN) point spacing.

Now you get to clean your data by looking at the average point spacing: any file whose value is too large or too small may be erroneous due to incorrect sampling and can already be deleted. It's also wise to inspect the bounding rectangles for inconsistencies.
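That screening step can also be scripted once you export the Pt_Spacing values. Here's a rough Python sketch; the median-based rule and the `factor` threshold are arbitrary assumptions of mine, not something from the tool itself.

```python
def flag_suspect_files(spacings, factor=2.0):
    """Flag files whose Pt_Spacing is more than `factor` times the median,
    or less than median / `factor` - a crude screen for bad sampling.

    `spacings` maps file name -> average point spacing; `factor` is an
    arbitrary assumption, tune it to your own data.
    """
    vals = sorted(spacings.values())
    n = len(vals)
    median = vals[n // 2] if n % 2 else (vals[n // 2 - 1] + vals[n // 2]) / 2
    return sorted(name for name, v in spacings.items()
                  if v > factor * median or v < median / factor)
```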

Next will be the loading of the LIDAR data.

My experience is on the ASCII format xyz data. Here are the steps.
1. Go to 3D Analyst Tools > Conversion > From File > ASCII 3D to Feature Class. (Input the average point spacing as reported in the Point File Information results. Specify the coordinate system, input file format, etc., if there are any.)
2. To create a DEM out of your multipoints, use Conversion Tools > To Raster > Point to Raster.
      - For the Value field, use Shape.Z.
      - For the Cell assignment type, use MEAN.
      - For the Cellsize, use the average point spacing x 4.
3. Export the data to GRID or TIFF to make it permanent.

OK, so now, for my next problem: I found out that my latitude and longitude were being interchanged in plotting. This is quite easy to solve in Matlab through:

M(:,[1, 2, 3])=M(:,[2, 1, 3]);

The matrix M contains columns 1, 2, 3 as lat, long, and ht respectively; what I did was interchange the columns so I could plot my cloud of points seamlessly in ArcGIS (ArcMap and ArcScene). Nice results so far.
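Since I'm planning to move my processing to Python anyway, the same column swap is just as short there. A sketch on plain lists of [lat, lon, ht] rows (the function name is mine):

```python
def swap_lat_lon(rows):
    """Swap the first two columns (lat, lon) of each row, leaving height
    untouched - the plain-Python equivalent of
    M(:,[1,2,3]) = M(:,[2,1,3]) in Matlab."""
    return [[lon, lat, h] for lat, lon, h in rows]
```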

Here's my sample of 200 rotations, meaning 20 seconds of Velodyne LIDAR data. The red triangles are points observed through GPS as traversed by the lidar.

Now my next step is to test the data at a larger scale, I mean for my entire dataset. Let's see what kind of problem will arise next...


Some Matlab scripting and coordinate-axes glitches in my code

I had some problems using my previous Matlab script for generating the converted latitude, longitude and height from the ECEF X, Y, Z coordinates.

for k = 1:5
    matFilename = sprintf('%d.csv', k);
    M = ecef2lla(textread(matFilename));
    dlmwrite('sample.xyz', M, '-append');
end

What happens here is that it writes the latitude, longitude and height formatted in three columns, but to 5 significant digits only, hence creating duplicated coordinates which wiped out most of the values in my sample.xyz. It took me the whole long weekend to figure out how to improve the results; I knew it was only in the formatting and not in the predefined functions per se.

As I've always been familiar with the %3.2f tag for specifying the decimal places of a float, I kept trying to fit it into my arguments, but to no avail. I almost gave up on the dlmwrite function and tried fprintf, which does accept the %X.Xf argument; however, the matrices that used to be formatted in three columns were now being printed in a single column. So what now? I figured I'd have a hard time re-reading and parsing my final single-column result just to turn it back into a three-column file, so I went back to dlmwrite(). This forced me to read the entire dlmwrite code and, alas, I found the 'precision' argument... ahah! Now my script works perfectly, just the way I want it.

for k = 1:500
    matFilename = sprintf('%d.csv', k);
    M = ecef2lla(textread(matFilename));
    dlmwrite('sample.xyz', M, '-append', 'precision', 10);
end

This simple Matlab code reads the numbered *.csv files in the folder where the script was saved, converts the earth-centered, earth-fixed X, Y, Z values through the ecef2lla function to get the latitude, longitude and height above the WGS-84 ellipsoid of the equivalent points, then writes the result to the 'sample.xyz' text file. The converted coordinates for every new file are appended to the output. To solve my previous formatting problem, I specified the 'precision' argument with the value 10 to write the formatted float values to 10 significant figures. Yey!

Now my next step is to plot my generated cloud of points in ArcGIS. And "luckily", I have another problem to figure out: I did something wrong with the orientation of my Velodyne LIDAR points relative to the ECEF coordinates.

Here's how it appeared in plan view using ArcMap. The orientation of the axes has been messed up.

And here's how it looks in perspective using ArcScene.

Panning and rotating the views, the real orientation should have been this way.

Now I need to try all the possible permutations of X, Y, Z... that would only be six, so I am now in trial-and-error mode. Hopefully I'll finish this, and my next move is to port my Matlab ecef2lla function to Python so I won't be cluttering up my processing steps. ;)
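A first sketch of that planned Python port, using the standard iterative method and the WGS-84 constants. To be clear, this is my own draft, not Matlab's exact ecef2lla implementation, and it's not robust at the poles:

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0             # semi-major axis, metres
F = 1 / 298.257223563     # flattening
E2 = F * (2 - F)          # first eccentricity squared

def ecef2lla(x, y, z):
    """Convert ECEF X, Y, Z (metres) to geodetic latitude, longitude
    (degrees) and height above the WGS-84 ellipsoid (metres).

    Simple fixed-point iteration; note it is NOT safe as p -> 0 (poles).
    """
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))  # initial guess
    for _ in range(10):                # converges in a few iterations
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    h = p / math.cos(lat) - n
    return math.degrees(lat), math.degrees(lon), h
```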


Minimizing my Velodyne LIDAR point cloud

I think I should remind myself how I did the timestamping of the point cloud data against the GPS.
Here are the seemingly easy steps.

  1. I converted the GPS data from geographic lat, long, ht to an ECEF (earth-centered, earth-fixed) cartesian coordinate system using the standard formulas X = (N + h)·cos(lat)·cos(lon), Y = (N + h)·cos(lat)·sin(lon), Z = (N·(1 − e²) + h)·sin(lat), where N = a / sqrt(1 − e²·sin²(lat)). The reason for converting into cartesian coordinates is so that I can easily do the translation from our Velodyne device to the GPS antenna. I used the WGS-84 reference ellipsoid parameters for the conversion: a = 6,378,137 m, b = 6,356,752.3142 m (so 1/f = 298.257223563).
  2. After I got all my coordinates in X, Y, Z, I matched the timestamps in the LIDAR packets with the timestamps provided by the GPS.
  3. I then performed some translation based on the structure of the vehicle mount that we fabricated for the fieldwork.
  4. Now my data is in X, Y, Z and ready for plotting in point cloud software.
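Step 1 above can be sketched in Python like this. The constants are the WGS-84 values; the function name and interface are mine, not the exact code I used:

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0             # semi-major axis, metres
F = 1 / 298.257223563     # flattening
E2 = F * (2 - F)          # first eccentricity squared

def lla2ecef(lat_deg, lon_deg, h):
    """Geodetic lat, lon (degrees) and height (m) -> ECEF X, Y, Z (m),
    i.e. the step-1 conversion described above (a sketch)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z
```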
When I was finally done with the timestamping of the LIDAR against the GPS, I was overwhelmed by the number of points generated from pcap to csv. For every rotation, a csv file was generated. Since the sensor spins at 10 Hz, there were about 10 files per second. I decided to just scrape off the other 9 and use every 10th csv generated.
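That decimation is simple enough to script. A sketch, assuming (as in my case) the rotation files are named '<n>.csv' with n counting up from 1:

```python
def decimate_rotations(csv_names, keep_every=10):
    """Keep every `keep_every`-th rotation file. With the sensor spinning
    at 10 Hz, keeping every 10th csv leaves roughly one rotation per
    second of driving. Assumes '<n>.csv' file names."""
    return [name for name in csv_names
            if int(name.split('.')[0]) % keep_every == 0]
```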

Then, as a sort of discretization, I thresholded my LIDAR point values to the specified usable range, 5 cm to 100 m.

I am now left with approximately 20% of my original data. Seems like it's gonna work this time.

The next problem: how do I fit my data to the baseline data and overlay them in the same coordinate system in my mapping software?

I tried creating a Matlab script that reads my csv files with coordinates in the ECEF XYZ coordinate system and converts them back to latitude and longitude. All the matrices of converted coordinates are appended into a single XYZ file for plotting later.

Here's the Matlab script that I've created:

for k = 1:5
    matFilename = sprintf('%d.csv', k);
    M = ecef2lla(textread(matFilename));
    dlmwrite('sample.xyz', M, '-append');
end

That's it. I never thought it could be so simple. Thanks to the predefined classes and functions in Matlab; I think I'll be using it more often. :)

Now I will just try to sort my data and remove some redundant points. So what's next? Tomorrow I'll have to try fitting the DEMs (digital elevation models). My next task is to convert my topo maps in AutoCAD dwg format to contours to create a DEM. I think this is pretty much easier. I think, I hope... Let's see. (^_^)


What's keeping me so busy right now?

Lately I haven't had much time to update my blogs because I'm currently dealing with some LIDAR data for my graduate thesis. I am quite bitter about the fact that this is really more of a programming thing than an analysis thing. I am burdened with learning a new programming language, or rather 'languages', just to be able to get meaningful results out of my raw data.

I have been inspired by another blog to just chronicle all my findings and queries so I can help others and so others may be able to help me too.

Also I guess this would be a good documentation for my future write ups for my methodologies.

So, to start with: I am using an HDL-32E Velodyne LIDAR. It's a laser rangefinder that uses 32 lasers cupped inside a TIN-CAN-SIZED enclosure, set to generate approximately 700,000 points per second. I have actually found quite a number of resources, even code, about Velodyne LIDAR, but it's for the HDL-64E, so porting the code to the 32E is quite risky if I don't do much research on their differences.

So, based on the manuals of both the Velodyne HDL-64E and the HDL-32E, I have tabulated the differences.

                                  Velodyne HDL-64E             Velodyne HDL-32E
  Laser blocks                    2 blocks of 32 diodes each   1 block of 32 laser detectors
  Vertical field of view          26.8 degrees                 41.3 degrees
  Spin rate                       15 Hz                        10 Hz
  Usable range                    0.09 m - 120 m               0.05 m - 100 m
  Points captured per second      1 million                    700,000
  Horizontal FOV                  360 degrees                  360 degrees

Now I just have to take note of these facts when explaining how MASSIVE the point cloud generated from the raw PCAP data of the Velodyne HDL-32E would be.

  • Each laser firing slot runs on a 1.152-microsecond cycle time.
  • There are 40 cycle times per 32 firings (1 fire per laser), with the remaining 8 cycles (~9.216 microseconds) as dead time for recharging.
  • Firing all 32 lasers therefore takes 46.08 microseconds.
  • Per packet there are 12 x 32 shots, so approximately 552.96 microseconds per packet.
  • Hence there are ~1,808 packets streamed per second, or ~694,444 shots per second.
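The throughput figures in those bullets can be re-derived from the cycle times alone; a quick Python check:

```python
# Re-deriving the HDL-32E throughput figures from the cycle times above.
CYCLE_US = 1.152        # one laser firing slot, in microseconds
CYCLES_PER_SEQ = 40     # 32 firings + 8 recharge (dead) slots
BLOCKS_PER_PACKET = 12  # each packet carries 12 blocks of 32 shots
SHOTS_PER_BLOCK = 32

seq_us = CYCLE_US * CYCLES_PER_SEQ      # time to fire all 32 lasers once
packet_us = seq_us * BLOCKS_PER_PACKET  # time covered by one packet
packets_per_sec = 1_000_000 / packet_us
shots_per_sec = packets_per_sec * BLOCKS_PER_PACKET * SHOTS_PER_BLOCK

print(f"{packets_per_sec:.0f} packets/s, {shots_per_sec:.0f} shots/s")
```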
So yeah, that means a lot of disk space and memory will be eaten up on my PC to get my data analysed afterwards.

Now, for my test fieldwork, we tried traversing Commonwealth Avenue. I used GPS Visualizer to convert my raw csv file to GPX format; you may also opt to save it in kml (Google Earth / Google Maps) or another format. Just be careful with the headers: the most important words are 'latitude' and 'longitude', so make sure to input them or you'll fail to convert your files.

This is approximately 7 km of data captured at a vehicle speed of 10-30 kph. The pcap file was about 1.3 GB, and when I converted it from pcap to csv through a Python script, it ballooned to a whopping 16 GB of data composed of 7,000+ csv files. That's A LOT!

To visualize the results, I need to timestamp the pcap data against the gps file, then do some resampling to minimize the number of points. I am done with the timestamping (which took me a week; I'm a bad programmer!), and now I have to deal with the resampling to get the minimum number of points loaded into my point cloud viewer or, most importantly, ArcGIS.

Now what? I am like staring at my data for 2 days now. What's next?