Posts

Showing posts from 2012

Free and Open Data Reliability: The case of Open Street Map road data vs "PAID" data sets

This is a very typical example of why the people in our country should free our data! Shown above is a map of the road data for my hydrologic model. Two unaligned road networks are visible in the image above. The red lines are my initial dataset, which is actually a "PAID" dataset from a government agency. When I did an overlay with a pan-sharpened satellite image, I got a large shift from the actual road network visible in the image itself. My initial notion was to georeference the image so that it would fit the readily available road data. However, when I tried overlaying the locations of my mapped inlets, I noticed that the storm drain network, which is aligned with the road, coincides with that of my backdrop image. These inlet locations are in-situ observations gathered from a week of fieldwork, hence there must have been a BIG problem with my road dataset. My next solution was to look for a free dataset. I tried using Open Street Map data and alas! OSM data has a

How to choose a course for college and get a career after graduation

A few weeks ago, I was asked to give a short talk to a few young people to share some wisdom on choosing their college course or future career. Since I haven't gone into industry and have spent much of my time on research, I preferred taking the college course guidance part and leaving the industry-and-career-after-graduation part to my husband, who has gone farther than me in terms of employment and experience, of course. So for me, I made this analogy: choosing your college course is almost the same as choosing your next faithful pair of shoes! Your choice of shoes should fit your budget, and so should your choice of college course. - Have a reality check. You might want the most expensive shoes and yet know you can't afford them just yet. You may try working harder while studying to earn your dream degree and yet end up not finishing it because of deeper financial constraints. So you'd better be realistic early on, or end up regretting an unfinished dream. Y

Why such a big deal with the rocket launch in North Korea?

North Korea's much-hyped rocket launch failed today, April 13, 2012. For the past few days, or even weeks, there has been much ado about this "overrated" launch. Most countries, especially in the first world, have been debating this move by NoKor. Reuters reports: "North Korea said it wanted the Unha-3 rocket to put a weather satellite into orbit, although critics believed it was designed to enhance its capacity to design a ballistic missile to deliver a nuclear warhead capable of hitting the continental United States." So, is that it? Is North Korea already powerful enough to wage war against those first world countries? The paranoia has reached countries within the flight path of the North Korean rocket, including the Philippines. And since we are not capable of shooting the debris from the sky, removing it from its trajectory, or totally crashing the rocket parts as they enter the Philippine area of responsibility, we were just advised to

Off-the-fieldnotes: On fieldworks using Handheld receivers

Last week was one tedious week: part of my primary data gathering is to map all the storm drains within my study area for my thesis. That is about 371 hectares of land with a mix of residential, commercial, and industrial land uses. I used two types of Garmin handheld receivers for this storm drain inventory and mapping fieldwork: the Garmin Oregon 550 and the Garmin GPSmap 76CSx. I would say that both are pretty good in terms of accuracy for handheld receivers whose purpose is mapping features just for reconnaissance or coarse accuracy requirements. Here are my rants/raves, my two cents, for fieldwork utilizing handheld GPS receivers, i.e. the Garmin GPSmap 76CSx and Oregon 550. Understand the primary purpose or objective of your data gathering and use the right tool for that purpose. - You can't use a handheld GPS receiver to plot the corners of your lot and use it as evidence in a land-grabbing case. Have a clear understanding of the capabilities of the GP

Back-sighting the history of Surveying

Last November 28 to December 2, 2011, I had the chance to participate in a workshop on GPS data analysis and modelling (GDAM) for scientific and practical applications, held at the National Engineering Center of UP Diliman. Part of the itinerary was a day-long field trip to the major research agencies that use the Global Positioning System (GPS) in the modelling and analysis of crustal movements for their respective scientific applications. The agencies we visited were the Philippine Institute of Volcanology and Seismology (PHIVOLCS), Manila Observatory, and the National Mapping and Resource Information Authority (NAMRIA). At PHIVOLCS we got a peek at how the GPS data from continuous and campaign sites are archived, processed, and delivered. These data are then used further for the analysis of crustal movements, specifically along the Philippine Fault, as well as the deformation and movements of active volcanoes in the Philippines such as Mayon and Kanlaon. Having bee

Woah! Win up to $1M by hacking Google Chrome!

It was with great awe that I discovered this morning how the Google Chrome developers have been so confident in the security of their product that they have put up some pretty great rewards for those who can break its security. Hack Chrome! Get rewards! Nice, nice! This year again, CanSecWest is offering up to a ONE MILLION DOLLAR reward for geeks who can crack Google Chrome and bring themselves to the pedestal of their dreams. Yeah, who knows? It might just be another entrepot for hidden talents that really ought to be brought out of the underground dark. So here are the compelling prizes to drool over: $60,000 - "Full Chrome exploit": Chrome / Win7 local OS user account persistence using only bugs in Chrome itself. $40,000 - "Partial Chrome exploit": Chrome / Win7 local OS user account persistence using at least one bug in Chrome itself, plus other bugs. For example, a WebKit bug combined with a Windows sandbox bug. $20,000 - "Consolation rewar

GIS Technology Plays Important Role to Map Disease and Health Trends

AURORA, Colo. – February 14, 2012 – Thanks to advancements in geographic information systems (GIS) technologies and mapping applications like ArcGIS, health organizations worldwide are mapping disease and sickness trends in an effort to treat them locally and globally. GIS tools and ArcGIS mapping applications play an important role in developing data-driven solutions that help health organizations visualize, analyze, interpret, and present complex geo-location data. The World Health Organization maintains an updated influenza map that shows Asia and Africa are at greater risk of the spread of flu. Other organizations such as Health-mapping.com keep up-to-the-minute, data-filled maps that cover water and health, influenza, and malaria. One map tracks water-related infectious diseases in the WHO European Region, focusing on the visualization of pan-European and worldwide water-related disease data that comes from the Centralized Information System for Infectious Diseases (CISID) database

How to input LIDAR data in ArcGIS

While I had been struggling before with how to resample my billions of points so I could visualize them in 3D and perform further analysis on them, I found out that ArcGIS would be the best solution. The usual discretization was performed on my data: I scraped out all the noisy points by limiting my range to 5 cm to 100 m. Then I added another criterion, which is to remove all points below my LIDAR setup, that is, to keep only points with theta (horizontal FOV) < 90 or theta > 270 degrees. That reduced my point cloud to almost 30% of the original. Memory-wise, it was a LUXURY already! Now, for checking the irregularities in the observed point cloud, the Point File Information tool is by far the best solution. Using ArcGIS 10.0:
1. Go to 3D Analyst Tools > Conversion > From File > Point File Information.
2. You will be prompted to browse the lidar point cloud by File or by Folder. (I suggest that you browse by Folder if you've got more than 10 files.)
3. Now you will be prompted for the f
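The range-and-angle filter described above can be sketched with NumPy before the data ever reaches ArcGIS. The column layout, sample values, and thresholds here are my own assumptions for illustration, not taken from the post:

```python
import numpy as np

# Hypothetical columns: x, y, z, range (m), theta (horizontal angle, degrees).
points = np.array([
    [1.0, 0.2, 0.1,  4.00,  45.0],  # kept: range OK, theta < 90
    [0.5, 0.1, 0.0,  0.02,  30.0],  # dropped: range below 5 cm (noise)
    [2.0, 1.0, 0.3, 12.00, 180.0],  # dropped: outside the kept angular sector
    [3.0, 0.5, 0.2,  8.00, 300.0],  # kept: range OK, theta > 270
])

rng, theta = points[:, 3], points[:, 4]

# Keep returns between 5 cm and 100 m, and only the theta<90 / theta>270 sector.
mask = (rng >= 0.05) & (rng <= 100.0) & ((theta < 90.0) | (theta > 270.0))
filtered = points[mask]

print(len(filtered))  # 2
```

With a real Velodyne dump the same boolean mask applies unchanged to millions of rows, which is what makes the 30% reduction cheap to compute.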

Some Matlab scripting and coordinate axes glitches on my codes

I had some problems using my previous Matlab script for generating the converted latitude, longitude, and height from the ECEF X, Y, Z coordinates:

for k = 1:5
    matFilename = sprintf('%d.csv', k);
    M = ecef2lla(textread(matFilename));
    dlmwrite('sample.xyz', M, '-append');
end

What happens here is that it writes the latitude, longitude, and height formatted in three columns but to 5 significant digits only, hence creating duplicated coordinates which scraped out most of my values in sample.xyz. It took me the whole long weekend to figure out how to improve the results; I know it's the formatting only and not the predefined classes per se. As I've always been familiar with the %3.2f tag for specifying the decimal places in a float, I kept trying to fit it into my argument, but to no avail. I almost gave up on using the "dlmwrite" function and tried using the "fprintf"
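The duplicates trace back to dlmwrite's default conversion, which keeps only five significant digits; Matlab's documented fix is the 'precision' option, e.g. dlmwrite('sample.xyz', M, '-append', 'precision', '%.9f'). A quick Python sketch (just to illustrate the rounding, using made-up longitudes) shows how five significant digits collapses nearby coordinates:

```python
# Made-up longitudes that differ only past the 5th significant digit.
lon_a, lon_b = 121.07712, 121.07743

# dlmwrite's default behaves like '%.5g': 5 significant digits.
low = ['%.5g' % v for v in (lon_a, lon_b)]
# An explicit fixed-point format keeps the coordinates distinct.
high = ['%.9f' % v for v in (lon_a, lon_b)]

print(low)   # both collapse to '121.08'
print(high)  # '121.077120000' vs '121.077430000'
```

At geographic scale, the fifth significant digit of a longitude near 121° is still tens of metres, so any two points in the same block collapse into one row.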

Minimizing my lidar velodyne point cloud

I think I should remind myself how I did the timestamping of the point cloud data with that of the GPS. Here are the seemingly easy steps. I converted the GPS data from geographic latitude, longitude, and height to an ECEF (Earth-centered, Earth-fixed) cartesian coordinate system using the standard geodetic-to-ECEF formulas. The reason for converting to cartesian is so that I can easily do the translation from our Velodyne device to the GPS antenna. I used the WGS84 reference ellipsoid parameters for the conversion: a = 6,378,137 m and b = 6,356,752.3142 m (1/f = 298.257223563). Now, after I got all my coordinates in X, Y, Z, I matched the timestamps in the LIDAR packets with the timestamps provided by the GPS. I then performed some translation based on the structure of the vehicle mount that we fabricated for the fieldwork. Now my data is in X, Y, Z and ready for plotting in point cloud software. When I was finally done with the timestamping of the LIDAR with the GPS, I was overwhelmed by the number of points generated from the pcap
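The geodetic-to-ECEF conversion step can be sketched with the standard closed-form equations and the WGS84 parameters; the function name and the test point are mine, not from the post:

```python
import math

# WGS84 ellipsoid: semi-major axis a and semi-minor axis b, in metres.
A = 6378137.0
B = 6356752.3142
E2 = 1.0 - (B / A) ** 2  # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Geodetic (degrees, metres) -> ECEF X, Y, Z in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime vertical radius of curvature at this latitude.
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

x, y, z = geodetic_to_ecef(14.65, 121.07, 50.0)  # roughly UP Diliman
```

Once everything is in metres along fixed axes, the lever-arm offset from the Velodyne head to the GPS antenna is a plain vector addition, which is exactly why the conversion comes before the translation.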

What's keeping me so busy right now?

Lately I haven't had much time to update all my blogs because I'm currently dealing with some LIDAR data for my graduate thesis. I am quite bitter about the fact that, really, this is more of a programming thing than an analysis thing. I am more burdened by learning a new programming language, or more so 'languages', just to be able to get meaningful results out of my raw data. I have been inspired by another blog to just chronicle all my findings and queries so I can help others, and so others may be able to help me too. Also, I guess this would be good documentation for my future write-ups of my methodologies. So to start with: I am using an HDL-32E Velodyne LIDAR. It's a laser rangefinder that uses 32 lasers packed inside a TIN-CAN-SIZED enclosure that's set to generate approximately 700,000 points per second. I have actually found quite a number of resources, even code, about the Velodyne LIDAR, but it's for the HDL-64E, and so porting the code to the 32E is quite dang