4.24.2012

Free and Open Data Reliability: The case of OpenStreetMap road data vs. "PAID" data sets

This is a very typical example of why the people in our country should free our data!


Shown above is a map of my road data for a hydrologic model. Two misaligned road networks are visible in the image. The red lines are my initial dataset, which is actually a "PAID" dataset from a government agency. When I overlaid it on a pan-sharpened satellite image, I found a large shift from the actual road network visible in the image itself. My initial notion was to georeference the image so that it would fit the readily available road data. However, when I overlaid the locations of my mapped inlets, I noticed that the storm drain network, which is aligned with the road, coincides with my backdrop image. These inlet locations are in-situ observations gathered from a week of fieldwork, hence there must've been a BIG problem with my road dataset. My next solution was to look for a free dataset. I tried using OpenStreetMap data and, alas, the OSM data has a better fit! The yellow road network above, which tends to coincide with the blue network of dots, is actually the road data from OpenStreetMap. See how great and more reliable those FREE and OPEN datasets are?

Of course there may be some risks in using these free datasets, but for my purpose, the OSM data obviously provides better accuracy than the "PAID" datasets we get, which are highly erroneous and, in the first place, shouldn't be PRICED at all!

I think our government should somehow consider this example. There's really no harm in opening our data to the public, though the limitations should be set properly.

With free and open data, we get a WIDER COMMUNITY to do the refining for us. The blunders are fixed over time, and we get something FREE, USEFUL, and MORE RELIABLE for the ancillary datasets that will soon build a more seamless layer overlay and analysis in our own geographic information systems.

4.23.2012

How to choose a course for college and get a career after graduation



A few weeks ago, I was asked to give a short talk to a few young people, sharing some wisdom on choosing their college course or future career. Since I haven't gone into the industry and have spent much of my time on research, I preferred taking the college course guidance part and leaving the industry-and-career-after-graduation part to my husband, who has gone farther than me in terms of employment and, of course, experience.


So for me, I made this analogy: Choosing your college course is almost the same as choosing your next faithful pair of shoes!
  1. Your choice of shoes should fit your budget, and so should your choice of college course. - Have a reality check. You might want the most expensive shoes, yet you know you can't afford them just yet. You may try working harder while studying to earn your dream degree, only to end up not finishing it because of deeper financial constraints. So you'd better be realistic early on, or end up regretting an unfinished dream.
  2. You will be the one to wear it, so you yourself should love it. - Be passion-driven. Know what you love doing the most and start from there. Your hobby may not necessarily bring you more money, but later in life, when you get to choose your job, you wouldn't want to get burned out by work that pushes you to death simply because you don't enjoy doing it and only want to earn big bucks from it. Mind you, losing a job is more bearable than losing your smile.
  3. You will invest money, time, and effort in choosing it, so you'd better get the pair that best fits your needs. - What are your priorities in choosing a course or career? Do you want to earn more money, or do you want to improve your skills and not really care whether you earn afterwards? Set your standards and choose the course that best fits those yardsticks. Taking an inexpensive barista or bartender course from a vocational school and then getting into a well-known hotel or cafe is more rewarding than finishing an expensive nursing degree only to end up as a call-center agent after graduation.
  4. You will use it for a long time, so it must be sturdy and able to withstand the test of time; make sure you buy a tried and tested brand. - Do your homework and research the best institutions offering your career of choice. The school may not be the real determinant of your success, but it will surely pave the way for you. Even though you may need to learn far more in the industry than what you picked up within the four corners of your classroom, graduates from the best schools always have their edge.
  5. You wouldn't want so many other people wearing the same shoes as yours. - Don't take a course just because it's IN. Four or five years from now, today's employment boom may no longer be as in-demand as it is. You wouldn't want to compete with everyone else for employment afterwards or, worse, end up underemployed, never getting the chance to practice your honed skills simply because there are not enough jobs for all of you graduates of that course. If you can, and I fervently hope you will... please don't ride the BANDWAGON of the unemployed/underemployed after graduation.
  6. When everything else fails and you feel like you ain't got a choice at all, TRUST THAT GOD WILL GIVE YOU THE BEST PAIR! - I have my own gauge of divine intervention. If an opportunity comes smoothly, almost without my lifting a finger, I know it's God's will. Do your best in choosing your career, but make sure you keep the balance among all aspects - social, physical, intellectual and, most importantly, spiritual. I was once a stranger to this Geomatics field. I never even knew such a course existed, but I know God placed me here to make a difference. It's divine intervention that made me push through and eventually love the field I am in right now. I would always think that GOD GAVE ME THIS and HE has a plan for me... of that I'm SURE!
And one last word, which I think is as important as your choice of career. Remember that the moment you get there... "Whatsoever thy hand findeth to do, do it with thy might." (Ecclesiastes 9:10)

Always give your best shot and develop into the dark room where you're processed. Plan your flight path, map your future and build a GIS of your dreams. #graduatesROCK-ON! 

4.12.2012

Why such a big deal with the rocket launch in North Korea?

North Korea's much-hyped rocket launch failed today, April 13, 2012. For the past few days, or even weeks, there has been much ado about this "overrated" launch. Most countries, especially in the first world, have been debating this move by NoKor.

Reuters reports:
"North Korea said it wanted the Unha-3 rocket to put a weather satellite into orbit, although critics believed it was designed to enhance its capacity to design a ballistic missile to deliver a nuclear warhead capable of hitting the continental United States."
So, is that it? Is North Korea already powerful enough to wage war against the first-world countries?

The paranoia has reached countries within the flight path of the North Korean rocket, including the Philippines. And since we are not capable of shooting the debris out of the sky, knocking it off its trajectory, or destroying the rocket parts as they enter the Philippine area of responsibility, we were simply advised to refrain from going out at the specific times of day when the rocket was supposed to pass over our country. What do you think?

I often wonder: how did people react to Japan's launch of spy satellites in 2003? Or how about the 4,000+ satellites launched since 1998?

Wisegeek also says:
 "There are approximately 3,000 satellites operating in Earth orbit, according to the US National Aeronautics and Space Administration (NASA), out of roughly 8,000 man-made objects in total. In its entire history, the SSN has tracked more than 24,500 space objects orbiting Earth. The majority of these have fallen into unstable orbits and incinerated during reentry. The SSN also keeps track which piece of space junk belongs to which country.
As of 2008, the former Soviet Union and Russia had nearly 1,400 satellites in orbit, the USA about 1,000, Japan more than 100, China about 80, France over 40, India more than 30, Germany almost 30, the UK and Canada 25, and at least ten each from Italy, Australia, Indonesia, Brazil, Sweden, Luxembourg, Argentina, Saudi Arabia, and South Korea. "
SpaceNews even reported a 10-year forecast:
"According to the annual forecast of global launch activity the consulting firm released Aug. 25, an estimated 1,145 satellites — worth $196 billion worldwide — will be built between 2011 and 2020. About 70 percent of the satellites can be attributed to government demand, Euroconsult said."
Those satellites, just like North Korea's, were of course launched on rockets. But why make an argument over stopping a satellite launch only now, after a single failed attempt? Just because it may have been a disguised ballistic missile test? What about the countries that launched their own rockets just a few years back? Why make such a fuss about a weather satellite from a relative newcomer to the space technology era?

Since retrieving a broken satellite from space is more expensive than launching one, the amount of space junk keeps increasing over time, creating SPACE POLLUTION, so to speak. With the number of operational and non-operational satellites and debris already in Earth's orbit, those seemingly fictional space junk images must really be turning into reality.


Now that our sky's NO LONGER the limit and there are already thousands of tons of space junk in orbit around our precious mother Earth, why trouble ourselves so much over a failed attempt to launch a "weather" satellite JUST NOW? Maybe the delayed reactions are already too late, or maybe they're just in time.

4.02.2012

Off-the-field notes: On fieldwork using handheld receivers

Last week was one tedious week: part of my primary data gathering is to map all the storm drains within my study area for my thesis, about 371 hectares of land with a mix of residential, commercial, and industrial land uses. I used two types of Garmin handheld receivers for this storm drain inventory and mapping fieldwork: the Garmin Oregon 550 and the Garmin GPSMAP 76CSx.


I would say that both are pretty good in terms of accuracy for handheld receivers meant for mapping features for reconnaissance or coarse accuracy requirements. Here are my rants/raves/two cents on fieldwork using handheld GPS receivers, i.e., the Garmin GPSMAP 76CSx and Oregon 550.
  • Do understand the primary purpose or objective of your data gathering and use the right tool for that purpose. - You can't use a handheld GPS receiver to plot the corners of your lot and use it as evidence in a land-grabbing case. Have a clear understanding of the capabilities of the GPS device you are using. Handheld GPS units are of the lower-accuracy type, with an accuracy buffer of 10-30 m, so if you are, for example, trying to map all the trees within a neighborhood or, as in my case, mapping all the storm drains to locate inlet locations for a flood simulation, the accuracy of a handheld receiver is already sufficient.
  • Use alkaline batteries if rechargeable NiMH batteries are not available. - As you may know, GPS devices are power hogs. This is the major reason your iPhone drains faster than usual when location services that utilize GPS are turned ON. So, if you've got one large area to cover for your mapping, better make sure you have lots of spare batteries at hand. Based on my experience, alkaline batteries (Energizer) lasted 8 straight hours on the Garmin GPSMAP 76CSx but only almost 4 hours on the Oregon 550. The longer run time of the 76CSx is understandable, since it doesn't have a camera eating up power, unlike the Oregon 550. On my first day I used Eveready batteries, but to my dismay they lasted only 2 straight hours of mapping, so I had to keep changing batteries, which is very much a hassle. NiMH batteries are also a good choice, lasting about 5 hours on the Oregon 550, but you still have to bring spares for longer mapping work.
  • Make sure to save all your gathered data immediately when you get back to the office. - This is my HARD-LEARNED lesson. I tried the Oregon 550 first, and since it supports memory expansion, I had no worries about running out of space. To my dismay, the Garmin 76CSx can only save up to 1,000 waypoints. I was an hour away from my last 3-kilometer stretch of the week-long fieldwork when I ran out of waypoint memory. Too bad I couldn't delete the previously gathered points :(. So unless you have a laptop in the field to readily offload your data, make sure to back it up and delete it from the GPS receiver memory to free up space for your newer observation points. For me, I installed the MapSource program to transfer data between my computer and the GPS device. It has options to save your data in the Garmin database, GPX (GPS exchange format), and TXT formats for further processing :) (see the GPX-reading sketch after this list)
  • Plan your route ahead. - There is indeed no substitute for a carefully planned fieldwork. It will definitely save you much of your resources: time, money, and effort. Even if there may be slight deviations from the original plan, planning ahead is still the best way to commence every field data gathering exercise.
  • Lastly, and again: know your device before going into the field! - Part of a carefully planned fieldwork is a thorough knowledge of operating the device you are going to use. Familiarity is the key. For user-friendliness, I like the Oregon 550 more because it already has a touch-screen interface, compared to the 76CSx, which is still operated by buttons. I never really had a hard time getting used to the Oregon 550, but the 76CSx is a whole different story. I don't know if it's just me, but it took me a good 2 hours to figure out how to mark waypoints using the Garmin 76CSx. So, for the record, HOW TO MARK A WAYPOINT USING THE GARMIN 76CSx: just long-press the ENTER button and there you have it! (^_^) Deleting waypoints and tracks is also harder, or I guess less intuitively designed; you can figure it out by pressing through the menus over and over. So yeah, FAMILIARITY IS THE KEY! Don't hesitate to explore your device first, or you'll spend the entire day out in the field figuring out how to use it...
  • Oh no, before I forget... In case you run out of space for waypoints on your handheld GPS and you still have a GPS-powered phone (that's not yet low on battery!)... you may opt to use it for mapping purposes too! I was really frustrated when I ran out of waypoint memory and forgot to use my HTC Mozart, which is also capable of geotagging photos. So yeah, for my next field mapping work, I'll be sure to remember my usual buddy for my location-based needs :) Since our modern smartphones are already GPS-powered, they work pretty much like our handheld receivers, with Assisted GPS (A-GPS) capability. A-GPS uses the nearest cell site to help fix your position faster. Very nice indeed. This is a photo taken during my fieldwork, for which I turned on the location option every time I took a photo. When I tried geotagging it in Picasa, it appeared to be about 10 m off the location given by the handheld GPS, which is fairly OK for mapping purposes.
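Since MapSource's GPX export is plain XML, the waypoints are easy to pull into other tools for backup or further processing. Here's a minimal Matlab sketch (the file names are hypothetical) that reads the waypoint coordinates from a GPX file and saves them as plain text:

doc  = xmlread('day1_inlets.gpx');          % GPX is plain XML (file name is made up)
wpts = doc.getElementsByTagName('wpt');     % each waypoint is a <wpt lat="..." lon="..."> element
coords = zeros(wpts.getLength, 2);
for i = 0:wpts.getLength-1                  % DOM node lists are zero-indexed
    w = wpts.item(i);
    coords(i+1,1) = str2double(char(w.getAttribute('lat')));
    coords(i+1,2) = str2double(char(w.getAttribute('lon')));
end
dlmwrite('day1_inlets_backup.txt', coords, 'precision', 10);   % plain-text backup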


Just as there are no hard and fast rules, I hope this helps. I'm sure I'll be getting more of these hard-learned lessons, which I will gladly share and chronicle here on my blog in the days to come... at least until I'm done with the data gathering part of my thesis #fieldworkIsMoreFun (^_^)

3.08.2012

Back-sighting the history of Surveying

Last November 28 to December 2, 2011, I had the chance to participate in the GPS Data Analysis and Modelling (GDAM) for Scientific and Practical Applications workshop held at the National Engineering Center of UP Diliman.

Part of the itinerary was a day-long field trip to the major research agencies that use the Global Positioning System (GPS) in the modelling and analysis of crustal movements for their respective scientific applications.

The agencies we visited were the Philippine Institute of Volcanology and Seismology (PHIVOLCS), the Manila Observatory, and the National Mapping and Resource Information Authority (NAMRIA).

At PHIVOLCS we got a peek at how the GPS data from continuous and campaign sites are archived, processed, and delivered. These data are then used further for the analysis of crustal movements, specifically along the Philippine Fault, as well as of the deformation and movements of active volcanoes in the Philippines such as Mayon and Kanlaon. Having been part of the PHIVOLCS GPS team for quite some time, I am already familiar with the processing scheme using Bernese 5.0, which is Windows-based. In this workshop, however, we were taught how to process and model GPS data using GAMIT-GLOBK and DEFNODE, respectively. GAMIT-GLOBK and DEFNODE both run on Linux.

Another office we visited was the Manila Observatory, where the IGS station PIMO is located.
A short introductory GPS presentation was given during our visit.

Of course, we also got sight of what has, since the 1980s I guess, been the only International GPS Service (IGS) station in the Philippines for the longest time: PIMO, appearing in the photo below. I think NAMRIA has recently added another IGS station, PTAG, located at their office in Taguig. So in total there are now two IGS stations in the Philippines tied to the International Terrestrial Reference Frame (ITRF), to account for the movement of our 7,107 islands, which are tectonically active and bounded by four major tectonic plates.

After the Manila Observatory, our next stop was NAMRIA, where the data processing and archiving of the Philippine Active Geodetic Network (PAGEnet) take place.



After the lecture and the introduction to the data monitoring procedures done at the PAGEnet Data and Control Center, we got the chance to tour the museum located on the ground floor of NAMRIA. The surveying instruments of the past are carefully preserved and displayed within the room.

Berger Theodolite

Transit

Wye Level

Trimble 4000 GPS

 Trimble 4000 SLD receiver


Wild-T2 Theodolite

JMR Doppler Receiver

India Magnetometer

Gravimeter

Three arm protractor

Sextant

A step through the relief map of the Philippines.

 A walk through the rich history of Surveying and Mapping in the Philippines


 A back-sight of the accuracy and precision instruments for surveying in the past...


Foresighting...

Cam-whoring (^_^)



3.01.2012

Woah! Win up to $1M by hacking Google Chrome!

It was with great awe that I discovered this morning how confident the Google Chrome developers are in the security of their product: they've put up some pretty great rewards for those who can break its security.


Hack Chrome! Get Rewards! Nice nice!


This year at CanSecWest, Google is offering up to a ONE MILLION DOLLAR reward for geeks who can crack Google Chrome and bring themselves to the pedestal of their dreams. Yeah, who knows? It might just be another outlet for hidden talents that really ought to be brought out of the dark underground.


So here are the compelling prizes to drool so much about:
$60,000 - “Full Chrome exploit”: Chrome / Win7 local OS user account persistence using only bugs in Chrome itself. 
$40,000 - “Partial Chrome exploit”: Chrome / Win7 local OS user account persistence using at least one bug in Chrome itself, plus other bugs. For example, a WebKit bug combined with a Windows sandbox bug.
$20,000 - “Consolation reward, Flash / Windows / other”: Chrome / Win7 local OS user account persistence that does not use bugs in Chrome. For example, bugs in one or more of Flash, Windows or a driver. These exploits are not specific to Chrome and will be a threat to users of any web browser. Although not specifically Chrome’s issue, we’ve decided to offer consolation prizes because these findings still help us toward our mission of making the entire web safer. 
All winners will also receive a Chromebook.
They will issue multiple rewards per category, up to the $1 million limit, on a first-come-first-served basis. There is no splitting of winnings or "winner takes all." They require each set of exploit bugs to be reliable, fully functional end to end, disjoint, of critical impact, present in the latest versions and genuinely "0-day," i.e. not known to Google or previously shared with third parties. Contestants' exploits must be submitted to and judged by Google before being submitted anywhere else.
Yey! The bribe is not that bad, so you playful geeks can now start the ball rolling! May fickle fate find you fierce fellows! (^_^)


Source: Chromium Blog

2.14.2012

GIS Technology Plays Important Role to Map Disease and Health Trends

AURORA, Colo. – February 14, 2012 – Thanks to the advancements in geographic information systems (GIS) technologies and mapping applications like ArcGIS, health organizations worldwide are mapping disease and sickness trends in an effort to treat them locally and globally.

GIS tools and ArcGIS mapping applications play an important role in developing data-driven solutions that help health organizations visualize, analyze, interpret and present complex geo-location data.

The World Health Organization maintains an updated influenza map showing that Asia and Africa are at greater risk for the spread of flu. Other organizations such as Health-mapping.com keep up-to-the-minute, data-filled maps that cover water and health, influenza and malaria.

One map tracks water-related infectious diseases in the WHO European Region, focusing on the visualization of pan-European and worldwide water-related disease data from the centralized information system for infectious diseases (CISID) database. The map covers HIV/AIDS, sexually transmitted diseases, tuberculosis, diphtheria and several other diseases.

Then there’s HealthMap.org, which was founded in 2006 to use online sources to help with disease outbreak monitoring.

Created by epidemiologists and software developers at Children’s Hospital Boston, the freely available website and mobile app ‘Outbreaks Near Me’ deliver real-time intelligence on a broad range of emerging infectious diseases for a diverse audience including libraries, local health departments, governments and international travelers.

HealthMap brings together disparate data sources, including online news aggregators, eyewitness reports, expert-curated discussions and validated official reports, to achieve a unified and comprehensive view of the current global state of infectious diseases and their effect on human and animal health.

Through an automated process, updating 24/7/365, the system monitors, organizes, integrates, filters, visualizes and disseminates online information about emerging diseases in nine languages, facilitating early detection of global public health threats.

For example, HealthMap.org recently released a report that several Massachusetts swans tested positive for low-path avian influenza or bird flu. Although the report indicated there is no threat to human health, this latest finding is just an example of how GIS can help save lives in the case of an outbreak.

“The real focus is identifying and focusing surveillance in hotspots around the world where we have potential for risk of a new disease that potentially might cause a pandemic worldwide,” said John Brownstein, co-founder of HealthMap.org and assistant professor of pediatrics at Harvard Medical School.

Maps are also tracking obesity and diabetes. CDC data and mapping indicate that 644 counties in 15 states account for most of the country's type 2 diabetes cases. This has been called the "Diabetes Belt," which spans from Appalachia into the Deep South. The data also show that a few counties in Michigan have higher rates, as do some regions in the West.

Meanwhile, Esri, an international geographic technology firm whose software is used by more than 350,000 organizations worldwide, has a GIS for Health & Human Services division that helps public health organizations and hospitals alike. Hospitals use ArcGIS for accurate and relevant patient information as well as for marketing, planning and community relations.

For example, the University of Kentucky Trauma Center at UK Chandler Hospital used a customized GIS application to better analyze data.

“We built the custom ArcGIS Server application using the Flex API to maximize accessibility and ease of use,” said Chris Walls, cofounder of 39°N, the firm that built the platform for UK. “We are extremely proud of this cutting-edge collaboration with the University of Kentucky. This kind of application will significantly streamline the administration of public facilities.”

GIS technology and ArcGIS offer tremendous potential to benefit the health care industry, and their many uses are just now beginning to be realized. The need for GIS professionals who are proficient in ArcGIS will grow even greater as organizations develop innovative ways to harness the data integration and spatial visualization power of GIS.

“The philosophy of American Sentinel’s bachelor’s in GIS curriculum is to prepare students for real workplace issues and challenges using state-of-the-art, industry leading software products such as Esri’s ArcGIS and open source technologies,” says Devon Cancilla, Ph.D., dean, business and technology at American Sentinel University.

Dr. Cancilla believes it's important to empower students with a current and applicable GIS skill set, such as using ArcGIS, so students can apply relevant knowledge on the job while improving their career trajectory.

Learn More About American Sentinel University’s GIS and Health Care Degrees
American Sentinel’s programs prepare students for entry into the GIS field and provide training in information systems that benefit other business areas. Learn more about American Sentinel University’s GIS degrees athttp://www.americansentinel.edu/online-degree/bachelor-degree-online/bachelor-gis-degree.php.

About American Sentinel University
American Sentinel University delivers the competitive advantages of accredited Associate, Bachelor's and Master's online degree programs focused on the needs of high-growth sectors, including information technology, computer science, GIS, computer information systems and business intelligence degrees. The university is accredited by the Distance Education and Training Council (DETC), which is listed by the U.S. Department of Education as a nationally recognized accrediting agency and is a recognized member of the Council for Higher Education Accreditation.

1.29.2012

How to input LIDAR data in ArcGIS

While I had been so troubled before about how to resample my billions of points to visualize them in 3D and perform further analysis on them, I found out that ArcGIS is the best solution.

The usual discretization was performed on my data: I scraped out all the noisy points by limiting the range to 5 cm to 100 m. Then I added another criterion: removing all points below my LIDAR setup, i.e., keeping only points with horizontal FOV angle theta < 90 degrees or theta > 270 degrees. That reduced my point cloud to almost 30% of the original. Memory-wise, it was already a LUXURY!
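For reference, here's a minimal Matlab sketch of that filtering, assuming (this column layout is mine, not a standard) the cloud is loaded as a matrix C with columns [theta_deg, range_m, x, y, z]:

keep = C(:,2) >= 0.05 & C(:,2) <= 100;         % usable range: 5 cm to 100 m
keep = keep & (C(:,1) < 90 | C(:,1) > 270);    % keep only the forward half of the horizontal FOV
C = C(keep,:);                                 % in my case roughly 30% of the points survive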

Now for checking the irregularities in the observed point cloud, using the Point File Information Tool is by far the best solution. Using ArcGIS 10.0:

1. Go to 3D Analyst Tools > Conversion > From File > Point File Information.
2. You will be prompted to browse for the lidar point cloud by File or by Folder. (I suggest browsing by Folder if you've got more than 10 files.)
3. You will then be prompted for the file format. Since I derived my point cloud manually from scratch, and most of my intensity returns are not very usable because of our nighttime observation, I opted for the XYZ format instead. The binary LAS format is the mainstream interchange format, which is optimized and, I guess, what's available from the vendors out there. Anyway...
4. Specify the suffix (file extension) and the coordinate system, if you have one.

I tested this using just the first 200 rotations to check my point file information.


What the process gives you are the bounding rectangle and the attribute table, with the average point spacing contained in one of the attributes. This average point spacing is crucial when you load and visualize the point cloud later.

So, right-click the Pt_Spacing field in the attribute table and check the Statistics to get the average (MEAN) point spacing.


Now you can clean your data by looking at the average point spacing: any file whose value is too large or too small may be erroneous due to incorrect sampling and can be deleted outright. It's also wise to check the bounding rectangles for inconsistencies.

Next will be the loading of the LIDAR data.

My experience is with ASCII-format XYZ data. Here are the steps.
1. Go to 3D Analyst Tools > Conversion > From File > ASCII 3D to Feature Class. (Input the average point spacing given by the Point File Information results; specify the coordinate system, input file format, etc., if any.)
2. To create a DEM out of your multipoints, use Conversion Tools > To Raster > Point to Raster:
      - For the Value field, use Shape.Z
      - For the cell assignment type, use MEAN
      - For the cell size, use the average point spacing x 4
3. Export the data to GRID or TIFF to make it permanent.

OK, so now for my next problem: I found out that my latitude and longitude were interchanged in plotting. This is quite easy to solve in Matlab:

M(:,[1, 2, 3]) = M(:,[2, 1, 3]);   % swap columns 1 and 2 (lat <-> long), keep height


The matrix M contains columns 1, 2, and 3 as lat, long, and ht respectively; what I did was interchange the first two columns so my cloud of points plots seamlessly in ArcGIS (ArcMap and ArcScene). Nice results so far.

Here are my sample 200 rotations, i.e., 20 seconds of Velodyne LIDAR data. The red triangles are points observed through GPS as traversed by the lidar.



Now my next step is to test the data at a larger scale, I mean for my entire dataset. Let's see what kind of problem arises next...

1.26.2012

Some Matlab scripting and coordinate axes glitches on my codes

I had some problems with my previous Matlab script for generating latitude, longitude, and height converted from the ECEF X, Y, Z coordinates.

for k = 1:5
    matFilename = sprintf('%d.csv', k);       % 1.csv, 2.csv, ...
    M = ecef2lla(textread(matFilename));      % ECEF X,Y,Z -> lat, long, height
    dlmwrite('sample.xyz', M, '-append');     % default precision: only 5 significant digits!
end



What happens here is that it writes the latitude, longitude, and height formatted in three columns, but with only 5 significant digits, hence creating duplicate coordinates that wiped out most of my values in sample.xyz. It took me the whole long weekend to figure out how to improve the results; I knew it was in the formatting only and not in the predefined functions per se.


Since I've always been familiar with the %3.2f tag for specifying the decimal places of a float, I tried every way to fit it into my arguments, but to no avail. I almost gave up on the "dlmwrite" function and tried "fprintf", which accepts the %X.Xf argument; however, the matrix that used to be formatted in three columns turned out to be printed in a single queue. (fprintf consumes a matrix column by column, so without a transpose and a per-column format, everything ends up in one stream; see the sketch below.) So what now? I figured I'd have a hard time re-reading and parsing my final single-column result back into a three-column file, so I went back to the 'dlmwrite()' function. This forced me to read the entire dlmwrite documentation and, alas, I found the 'precision' argument... aha! Now my script works perfectly, just the way I want it.
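For the record, fprintf can keep the three-column layout too; the catch is the transpose plus one format specifier per column. A minimal sketch of that abandoned alternative:

fid = fopen('sample.xyz', 'a');               % append, like dlmwrite's '-append'
fprintf(fid, '%.10f,%.10f,%.10f\n', M');      % note the transpose: fprintf reads column-major
fclose(fid);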



for k = 1:500
    matFilename = sprintf('%d.csv', k);
    M = ecef2lla(textread(matFilename));                     % ECEF X,Y,Z -> lat, long, height
    dlmwrite('sample.xyz', M, '-append', 'precision', 10);   % write 10 significant digits
end


This simple Matlab code reads the csv files numbered 1 to 500 in the folder where the script is saved, converts the earth-centered, earth-fixed X, Y, Z values through the ecef2lla function into the latitude, longitude, and height above the WGS-84 ellipsoid of the equivalent points, then writes the result to the 'sample.xyz' text file, appending the converted coordinates for every new file. To solve my previous formatting problem, I specified the 'precision' argument with the value 10 to write the formatted float values with 10 significant figures. Yey!


Now my next step is to plot my generated cloud of points in ArcGIS. And "luckily," I have another problem to figure out: I did something wrong with the orientation of my Velodyne LIDAR points relative to the ECEF coordinate axes.


Here's how it appeared in plan view using ArcMap. The orientation of the axes has been messed up.


And here's how it looks in perspective using ArcScene.

 Panning and rotating the views, the real orientation should have been this way.

Now I need to try all the possible permutations of the X, Y, Z columns... there are only six, so I am now in trial-and-error mode (sketched below). Hopefully I'll finish this soon, and my next move is to port my Matlab ecef2lla workflow to Python so I won't be cluttering up all my processing steps. ;)
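Since the point matrix is already in Matlab, the six candidate orderings can be written out in one go instead of editing the script six times. A minimal sketch (the output file names are mine):

P = perms([1 2 3]);                                  % all six orderings of the three columns
for p = 1:size(P,1)
    outFile = sprintf('perm_%d%d%d.xyz', P(p,:));    % e.g. perm_213.xyz (hypothetical names)
    dlmwrite(outFile, M(:,P(p,:)), 'precision', 10);
end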

1.20.2012

Minimizing my lidar velodyne point cloud

I think I should remind myself how I did the timestamping of the point cloud data against the GPS data.
Here are the seemingly easy steps.

  1. I converted the GPS data from geographic lat, long, ht to the ECEF (Earth-centered, Earth-fixed) cartesian coordinate system. The reason for converting to cartesian is so that I can easily do the translation from our Velodyne device to the GPS antenna. I used the WGS84 reference ellipsoid parameters for the conversion: a = 6,378,137 m, 1/f = 298.257223563 (b = 6,356,752.3142 m). (See the sketch after this list.)
  2. After I got all my coordinates in X, Y, Z, I matched the timestamps in the LIDAR packets with the timestamps provided by the GPS.
  3. I then performed some translation based on the structure of the vehicle mount that we fabricated for the fieldwork.
  4. Now my data is in X, Y, Z and ready for plotting in point cloud software.
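For reference, here's a minimal Matlab sketch of the step-1 conversion, using the standard geodetic-to-ECEF formulas on WGS84 (the sample point is made up; the Aerospace Toolbox lla2ecef function does the same job):

a  = 6378137.0;                  % WGS84 semi-major axis [m]
f  = 1/298.257223563;            % WGS84 flattening
e2 = f*(2 - f);                  % first eccentricity squared
lat = 14.65*pi/180; lon = 121.05*pi/180; h = 80.0;   % hypothetical sample point
N = a / sqrt(1 - e2*sin(lat)^2);                     % prime vertical radius of curvature
X = (N + h) * cos(lat) * cos(lon);
Y = (N + h) * cos(lat) * sin(lon);
Z = (N*(1 - e2) + h) * sin(lat);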
When I was finally done with the timestamping of the LIDAR against the GPS, I was overwhelmed by the number of points generated from pcap to csv. For every rotation, a csv file is generated, and since the unit spins at 10 Hz, that was about 10 files per second. I decided to just scrape off the other 9 and use every 10th csv generated.

Now, for some sort of discretization, I thresholded my LIDAR point values to the specified usable range, 5 cm to 100 m.
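A minimal sketch of this thinning and thresholding, assuming (hypothetically) each numbered csv holds one rotation with columns [x, y, z, range_m]:

nFiles = 7000;                                   % roughly one csv per rotation
for k = 10:10:nFiles                             % keep every 10th rotation (1 per second at 10 Hz)
    C = csvread(sprintf('%d.csv', k));
    C = C(C(:,4) >= 0.05 & C(:,4) <= 100, :);    % usable range: 5 cm to 100 m
    dlmwrite('thinned.csv', C, '-append');       % accumulate the surviving points
end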

I am now left with approximately 20% of my original data. Seems like it's gonna work this time.

The next problem: how do I fit my data to the baseline data and overlay them in the same coordinate system in my mapping software?

I tried creating a Matlab script that reads my csv files with coordinates in the ECEF XYZ coordinate system and converts them back to latitude and longitude. All the matrices of converted coordinates are appended to a single XYZ file for plotting later.

Here's the Matlab script I created:


for k = 1:5
    matFilename = sprintf('%d.csv', k);       % 1.csv, 2.csv, ...
    M = ecef2lla(textread(matFilename));      % ECEF X,Y,Z -> lat, long, height (Aerospace Toolbox)
    dlmwrite('sample.xyz', M, '-append');
end
That's it. I never thought it could be so simple, thanks to the predefined classes and functions in Matlab. I think I'll be using it more often. :)

Now I will just sort my data and remove some redundant points. So what's next? Tomorrow I'll try fitting the DEMs (digital elevation models). My next task is to convert my topo maps in AutoCAD DWG format to contours to create a DEM. I think this is pretty much easier. I think, I hope... Let's see. (^_^)

1.19.2012

What's keeping me so busy right now?

Lately I haven't had much time to update all my blogs because I'm currently dealing with some LIDAR data for my graduate thesis. I am quite bitter about the fact that, really, this is more of a programming thing than an analysis thing. I am burdened with learning a new programming language, or more so 'languages', just to be able to get meaningful results out of my raw data.

I have been inspired by another blog to just chronicle all my findings and queries so I can help others and so others may be able to help me too.

Also, I guess this will be good documentation for the future write-ups of my methodology.

So, to start with: I am using an HDL-32E Velodyne LIDAR. It's a laser rangefinder that uses 32 lasers housed inside a TIN-CAN-SIZED enclosure, set to generate approximately 700,000 points per second. I have actually found quite a number of resources, even code, about Velodyne LIDAR, but they're for the HDL-64E, so porting the code to the 32E is quite dangerous if I don't research their differences well.

So, based on the manuals of both the Velodyne HDL-64E and HDL-32E, I have tabulated the differences.


                               Velodyne HDL-64E              Velodyne HDL-32E
Block of lasers                2 blocks of 32 diodes each    1 block of 32 laser detectors
Vertical field of view         26.8 degrees                  41.3 degrees
Spin rate                      15 Hz                         10 Hz
Usable range                   0.09 m to 120 m               0.05 m to 100 m
Points captured per second     ~1 million                    ~700,000
Horizontal FOV                 360 degrees                   360 degrees

Now I think I just have to take note of these facts when explaining how MASSIVE the point cloud generated by the raw PCAP data from the Velodyne 32E LIDAR would be.


  • Each laser firing slot runs on a 1.152-microsecond cycle time.
  • There are 40 cycle times per 32 firings (1 fire per laser), with the remaining 8 cycles (~9.216 microseconds) as dead time for recharging.
  • Firing all 32 lasers therefore takes 46.08 microseconds.
  • Per packet there are 12 x 32 shots, so approximately 552.96 microseconds per packet.
  • Hence about 1,808 packets are streamed per second, or roughly 694,272 shots per second (double-checked in the quick sketch below).
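Here's the quick Matlab check of those figures:

cycle_t   = 1.152e-6;            % seconds per firing slot
block_t   = 40 * cycle_t;        % 32 firings + 8 recharge slots = 46.08 microseconds
packet_t  = 12 * block_t;        % 12 blocks of 32 shots per packet = 552.96 microseconds
packets_s = floor(1/packet_t)    % -> 1808 packets per second
shots_s   = packets_s * 12 * 32  % -> 694,272 shots per second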
So yeah, that means the data will eat up lots of disk space and memory on my PC when I analyze it afterwards.

For my test fieldwork, we traversed Commonwealth Avenue. I used GPS Visualizer to convert my raw csv file into GPX format; you may also opt to save it in KML (Google Earth/Google Maps) or another format. Just be careful with the headers: the most important words are 'latitude' and 'longitude', so make sure to input them or you'll fail to convert your files.
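For example, a header row like this (the values are made up) converts cleanly:

latitude,longitude,name
14.6549,121.0645,test-point-1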

This is approximately 7 km of data captured at a vehicle speed of 10-30 kph. The pcap file was about 1.3 GB, and when I converted it from pcap to csv through a Python script, it became a whopping 16 GB of data composed of 7,000+ csv files. That means A LOT!

Now, to visualize the results, I need to timestamp the pcap data against the GPS file and later do some resampling to minimize the number of points. I am done with the timestamping (which took me a week; I'm a bad programmer!), and now I have to deal with the resampling to get the minimum number of points loaded into my point cloud viewer or, most importantly, ArcGIS.

Now what? I've been staring at my data for 2 days now. What's next?