What's keeping me so busy right now?

Lately I haven't had much time to update my blogs because I'm currently dealing with LIDAR data for my graduate thesis. I'm quite bitter about the fact that, really, this is more of a programming thing than an analysis thing. I'm even more burdened by having to learn a new programming language, or rather 'languages', just to be able to get meaningful results out of my raw data.

I have been inspired by another blog to just chronicle all my findings and queries so I can help others and so others may be able to help me too.

I also figure this would be good documentation for the methodology sections of my future write-ups.

So, to start with: I am using a Velodyne HDL-32E LIDAR. It's a laser rangefinder with 32 lasers packed inside a tin-can-sized enclosure, set to generate approximately 700,000 points per second. I have actually found quite a number of resources, and even code, for Velodyne LIDARs, but most of it is for the HDL-64E, so porting that code to the 32E is quite risky if I don't do enough research on their differences.

So, based on the manuals of both the Velodyne HDL-64E and the HDL-32E, I have tabulated the differences.


                              Velodyne HDL-64E             Velodyne HDL-32E
Blocks of lasers              2 blocks of 32 diodes each   1 block of 32 lasers/detectors
Vertical field of view        26.8 degrees                 41.3 degrees
Spin rate                     15 Hz                        10 Hz
Usable range                  0.09 m - 120 m               0.05 m - 100 m
Points captured per second    ~1 million                   ~700,000
Horizontal field of view      360 degrees                  360 degrees

Now I think I just have to keep these facts in mind when explaining how MASSIVE the point cloud generated from the raw PCAP data of the Velodyne 32E LIDAR would be.


  • Each laser firing occupies a 1.152-microsecond time slot.
  • A full firing sequence spans 40 such slots: 32 firings (one per laser) plus 8 idle slots (~9.216 microseconds) of dead time for recharging.
  • Firing all 32 lasers therefore takes one full cycle of 40 x 1.152 = 46.08 microseconds.
  • Each data packet carries 12 blocks of 32 shots, so a packet spans 12 x 46.08 = 552.96 microseconds.
  • Hence roughly 1,808 packets are streamed per second, or about 694,444 shots per second (see the sanity check after this list), which squares with the ~700,000 points-per-second figure.
So yeah, that means a lot of disk space and memory will be eaten up on my PC before I can get my data analysed afterwards.
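Here's that arithmetic as a quick Python sanity check; it only encodes the numbers from the list above, nothing sensor-specific:

```python
# Sanity check for the HDL-32E timing figures above.
SLOT_US = 1.152          # one laser firing slot, in microseconds
SLOTS_PER_CYCLE = 40     # 32 firings + 8 recharge slots
LASERS = 32
BLOCKS_PER_PACKET = 12   # each data packet carries 12 firing blocks

cycle_us = SLOT_US * SLOTS_PER_CYCLE        # 46.08 us to fire all 32 lasers
packet_us = cycle_us * BLOCKS_PER_PACKET    # 552.96 us per packet
packets_per_sec = 1e6 / packet_us           # ~1,808 packets/s
shots_per_sec = packets_per_sec * BLOCKS_PER_PACKET * LASERS  # ~694,444 shots/s

print(f"{cycle_us:.2f} us per cycle, {packet_us:.2f} us per packet")
print(f"{packets_per_sec:.0f} packets/s, {shots_per_sec:,.0f} shots/s")
```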

Now for my test fieldwork, we tried traversing Commonwealth Avenue. I used GPS Visualizer to convert my raw CSV file into GPX format; you may also opt to save it in KML (Google Earth / Google Maps) or another format. Just be careful with the headers: the most important words are 'latitude' and 'longitude', so name your columns exactly that or you'll fail to convert your files.
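If you'd rather script this conversion than upload through GPS Visualizer, here's a minimal sketch using the gpxpy library. The file names and the 'latitude'/'longitude' column names are assumptions matching the header advice above:

```python
import csv
import gpxpy.gpx

# Build an empty GPX document with one track and one segment.
gpx = gpxpy.gpx.GPX()
track = gpxpy.gpx.GPXTrack()
gpx.tracks.append(track)
segment = gpxpy.gpx.GPXTrackSegment()
track.segments.append(segment)

# 'traverse.csv' is a placeholder; the CSV must have 'latitude' and
# 'longitude' columns, just like GPS Visualizer expects.
with open('traverse.csv', newline='') as f:
    for row in csv.DictReader(f):
        segment.points.append(gpxpy.gpx.GPXTrackPoint(
            latitude=float(row['latitude']),
            longitude=float(row['longitude'])))

with open('traverse.gpx', 'w') as f:
    f.write(gpx.to_xml())
```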

This is approximately 7 km of data captured at a vehicle speed of 10-30 kph. The PCAP file was about 1.3 GB, and when I converted it to CSV through a Python script, it ballooned to a whopping 16 GB spread across 7,000+ CSV files. That means A LOT!
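For reference, here's roughly what such a conversion script has to do. This is a minimal sketch, not my actual code: it uses the dpkt library to walk the PCAP and the packet layout from the HDL-32E manual (1206-byte UDP payloads of 12 firing blocks, azimuth in hundredths of a degree, distances in 2 mm units):

```python
import struct
import dpkt

def read_hdl32(pcap_path):
    """Yield (azimuth_deg, [(distance_m, intensity), ...]) per firing block."""
    with open(pcap_path, 'rb') as f:
        for ts, buf in dpkt.pcap.Reader(f):
            eth = dpkt.ethernet.Ethernet(buf)
            if not isinstance(eth.data, dpkt.ip.IP):
                continue
            ip = eth.data
            if not isinstance(ip.data, dpkt.udp.UDP):
                continue
            payload = ip.data.data
            if len(payload) != 1206:      # keep HDL-32E data packets only
                continue
            for block in range(12):       # 12 firing blocks, 100 bytes each
                off = block * 100
                flag, azi = struct.unpack_from('<HH', payload, off)
                if flag != 0xEEFF:        # block start marker per the manual
                    continue
                shots = []
                for ch in range(32):      # 32 (distance, intensity) pairs
                    dist, inten = struct.unpack_from(
                        '<HB', payload, off + 4 + ch * 3)
                    shots.append((dist * 0.002, inten))  # 2 mm units -> metres
                yield azi / 100.0, shots
```

Dumping every yielded block straight to CSV is exactly how a 1.3 GB binary capture fans out into many gigabytes of text.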

Now, to visualize the results, I need to match the timestamps of the PCAP data with those of the GPS file, then do some resampling to minimize the number of points. I am done with the timestamping (which took me a week, I'm a bad programmer!), and now I have to deal with the resampling so that the minimum number of points gets loaded into my point cloud viewer or, most importantly, ArcGIS.
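One common way to do that thinning is a voxel-grid downsample: snap the points to a coarse 3D grid and keep one point per occupied cell. A minimal numpy sketch, assuming the cloud is already an N x 3 XYZ array; the 0.5 m cell size is just a placeholder to tune:

```python
import numpy as np

def voxel_downsample(points, cell=0.5):
    """Keep one point per occupied (cell x cell x cell) voxel."""
    idx = np.floor(points / cell).astype(np.int64)       # voxel index per point
    _, keep = np.unique(idx, axis=0, return_index=True)  # first hit per voxel
    return points[np.sort(keep)]                         # preserve input order

# e.g. points = np.loadtxt('sweep.csv', delimiter=',', usecols=(0, 1, 2))
# thinned = voxel_downsample(points, cell=0.5)
```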

Now what? I've been staring at my data for two days now. What's next?
