Archive for the ‘Maps’ Category

Adding a Scale Bar in the QGIS Map Composer

Friday, August 31st, 2012

I recently finished revisions for the next edition of the manual for my GIS workshop, and incorporated a new section on adding a scale bar in the QGIS map composer. I thought I’d share that piece here. I’m using the latest version of QGIS, 1.8 Lisboa.

The scale bar in the map composer automatically takes the units used in the coordinate reference system (CRS) that your layers are in. In order to get a scale bar that makes sense, your layers will need to use a CRS that’s in meters or feet. If you’re using a geographic coordinate system that’s in degrees, like WGS 84 or NAD 83, the result isn’t going to make sense. It’s possible at large scales (covering small areas) to convert degrees to meters or feet but it’s a pain. Most projected coordinate systems for countries, continents, and the globe use meters. Regional systems like UTM also use meters, and in the US you can use a state plane system and choose between meters or feet. Transform your layers to something that’s appropriate for your map.
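To see concretely why degrees make poor scale bar units, here's a quick stdlib sketch (mine, not from the post) of the approximate ground distance covered by one degree; the constants are rounded spherical-Earth values, good enough for illustration only:

```python
# Rough meters-per-degree at a given latitude, to illustrate why working
# in degrees makes scale bars painful: a degree of longitude covers less
# ground the farther you are from the equator. Approximate constants.
import math

def meters_per_degree(lat_deg):
    """Approximate ground meters per degree of longitude and latitude at lat_deg."""
    lat = math.radians(lat_deg)
    m_per_deg_lat = 111_132.0                  # nearly constant everywhere
    m_per_deg_lon = 111_320.0 * math.cos(lat)  # shrinks toward the poles
    return m_per_deg_lon, m_per_deg_lat

lon_m, lat_m = meters_per_degree(40.0)  # roughly New York's latitude
print(round(lon_m), lat_m)  # about 85,000 m per degree of longitude here
```

Since the longitude factor changes with latitude, a single degree-based scale bar can't be right across the whole map, which is why transforming to a meter- or foot-based CRS is the sane route.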

Once you have layers that are in a CRS that uses meters or feet, you’ll need to convert the units to kilometers or miles using the scale bar’s menu. Let’s say I have a map of the 50 states that’s in the US National Atlas Equal Area Projection, which is a continental projected coordinate system of the US, number 2163 in the EPSG library (this system is also known as the Lambert Azimuthal Equal-Area projection). It’s in meters. In the QGIS Map Composer I select the Add New Scalebar button, and click on the map to add it. The resulting scale bar is rather small and the units are clumped together.

Scalebar button on the toolbar

With the scale bar active, I click on Item Properties in the menus on the right. First, we have to decide how large we want the individual segments on the scale bar to be. I'm making a thematic map of the US, so for my purposes 200 miles per segment is fine. The conversion factor is 1,609 meters to a mile. Since we want each segment on the scale bar to represent 200 miles, we multiply 1,609 by 200 to get 321,800 meters. In the Item tab, this is the number I type in the segment size box: 321,800, replacing the default 100,000. Then, in the map units per bar box, I change the value from 1 to 1,609. Now the units of the scale bar start to make sense. I increase the number of right segments from 2 to 3, so I have a bar that goes from 0 to 600 miles in 200-mile segments. I like to decrease the height of the bar a bit, from 5mm to 3mm. In the Unit label box I type Miles. Lastly, I switch to the General options tab just below and turn off the outline for the bar. Now I have a scale bar that's appropriate for this map!

Most people in the rest of the world will be using kilometers. That conversion is even simpler. If we wanted the scale bar segments to represent 200 km, we would multiply 200 by 1,000 (1,000 meters in a kilometer) to get 200,000 meters. 200,000 goes in the segment size box and 1,000 goes in the map units per bar. Back in the US, if you’re working in a local state plane projection that uses feet, the conversion will be the familiar 5,280 feet to a mile. So, if we wanted a scale bar that has segments of 20 miles, multiply 5,280 by 20 to get 105,600 feet. 105,600 goes in the segment size box, and 5,280 goes in the map units per bar box.
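The arithmetic in the last two paragraphs can be wrapped up in a tiny helper; this is just illustrative Python (the function and dictionary are mine, not part of QGIS):

```python
# Sketch of the scale bar arithmetic described above: the segment size you
# type into the Item tab is (desired segment length) x (conversion factor),
# and "map units per bar" is the conversion factor itself.
CONVERSIONS = {
    ("meters", "miles"): 1609,   # the post rounds 1,609.344 down to 1,609
    ("meters", "km"): 1000,
    ("feet", "miles"): 5280,
}

def scalebar_settings(segment_length, crs_unit, label_unit):
    """Return the two values to enter in the QGIS scale bar dialog."""
    factor = CONVERSIONS[(crs_unit, label_unit)]
    return {"segment size": segment_length * factor,
            "map units per bar": factor}

print(scalebar_settings(200, "meters", "miles"))  # segment size 321800
print(scalebar_settings(200, "meters", "km"))     # segment size 200000
print(scalebar_settings(20, "feet", "miles"))     # segment size 105600
```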

A reality check is always a good idea. Take your scale bar and compare it to some known distance between features on your map. In our first example, I would drag the scale bar over the map of the US and measure the width of Illinois at its widest point, which is just over 200 miles. In the Item Properties tab for the scale bar I turn the opacity down to zero, so the bar doesn't hide the map features underneath. If the bar matches up I know I'm in good shape. Remember that not all map projections preserve distance as a property, and distortion becomes an issue for small scale maps that cover large areas (i.e. continents and the globe). On small scale maps distance will be true along the standard parallels, and will become less accurate the farther away from them you go.
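The reality check can also be done numerically. Here's a stdlib sketch using the haversine formula with an average Earth radius; the coordinates below roughly span Illinois east to west and are approximate, picked only for illustration:

```python
# A code version of the "reality check": compute the great-circle distance
# between two points and compare it to what the scale bar implies.
import math

def haversine_miles(lon1, lat1, lon2, lat2):
    """Great-circle distance in miles between two lon/lat points (degrees)."""
    R = 3959.0  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2)**2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2)**2
    return 2 * R * math.asin(math.sqrt(a))

# Two points roughly spanning Illinois east-west at its widest
# (approximate coordinates, for illustration only)
print(round(haversine_miles(-87.5, 40.2, -91.5, 40.2)))
```

If the result lands near the "just over 200 miles" figure the scale bar should show, the conversion was done correctly.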

Plan Your Trip through the Roman Empire with ORBIS

Wednesday, June 27th, 2012

If you wanted to know the fastest route from Roma to Londinium in June of 300 AD or how much it would cost to ride the shortest distance at ox cart speed to Constantinopolis, check out ORBIS. Researchers at Stanford have created a model of Ancient Roman transport networks over land and sea composed of 751 points (cities, landmarks, mountain passes) and thousands of linkages.

The model simulates the average distance of a large group of travelers taking a given route in a given month. The frictions of distance, terrain, climate, and monetary expense are all accounted for in the model and you have the ability to set many of the options. The technical aspects of the project as well as its historical bases are thoroughly documented. The output consists of route maps (which you can download as KML or as CSV) and interactive cartograms. The platform is an open source stack – PostgreSQL with PostGIS, Open Layers, Geoserver, and some JavaScript libraries.

Check it out at http://orbis.stanford.edu.

The fastest route from Roma to Londinium in June? A boat ride across the Mediterranean to Narbo, foot/army/pack animal across southern Gaul, and a coast-hugging boat ride from Burdigala will get you there in 26.6 days and 2,974 kilometers. That carriage to Constantinopolis would cost you about 2,087 denarii and would take 128 days at ox cart speed – perhaps you should consider a fast military march instead?

Mapping Domestic Migration with IRS Data

Friday, November 18th, 2011

Forbes magazine just published a neat interactive map of American migration using data not from the Census Bureau but from the IRS. Whether you file electronically or the old fashioned way, everyone fills in an address at the top of the 1040, and the IRS stores this in a database. If you file from a different address from one year to the next, you must have moved, and the IRS publishes a summary file of where people went (with all personal information and practically all filing data stripped away).

The Forbes map taps into five years of this data and lets you see all domestic in-migration and out-migration from a particular county. The map is a flow or line map with lines going from the county you choose to each target – net in-migration to your county is colored in blue and net out-migration is red. You can also hover over the sending and receiving counties to see how many people moved. Click on the map to choose your county or search by name; you also have the option of searching for cities or towns, as the largest place within each county is helpfully identified and tied to the data.

It’s relatively straightforward and fun to explore. Some of the trends are pretty striking – the difference between declining cities (Wayne County – Detroit MI) and growing ones (Travis County – Austin TX) is pretty vivid, as is the change in migration during the height of the housing boom period in 2005 compared to the depth of the bust in 2009 (see Maricopa County – Phoenix AZ). More subtle is the difference in the scope of migration between urban and rural counties, with the former having more numerous and broader connections and the latter having smaller, more localized exchanges. Case in point is my home state of Delaware – urban New Castle County (Wilmington) compared to rural Sussex County (Seaford). There are many other stories to see here – the exodus from New Orleans after Katrina and the subsequent return of residents, the escape from Los Angeles to the surrounding mountain states, and the pervasiveness of Florida as a destination for everybody (click on the thumbnails below for full images of each map).

Wayne Co MI (Detroit) 2009

Travis Co TX (Austin) 2009

Maricopa Co AZ (Phoenix) 2005

Maricopa Co AZ (Phoenix) 2009

New Castle Co DE (Wilmington) 2009

Sussex Co DE (Seaford) 2009

While the map is great, the even better news is that the data is free and anyone can download it from the IRS Statistics page. They provide a lot of summary data – information for individuals is never reported. The individual tax data page, with data gleaned from the 1040, has the most geographic detail. If you want to see how much and what kind of tax is collected by state, county, and ZIP code, you can get it there. The US Population Migration data used to create the Forbes map is also there, and the years from 2005 to 2009 are free (migration data from 1991 to 2004 is available for purchase).

You can download separate files for county inflow and county outflow on a state-by-state basis in Excel (.xls) format, or you can download the entire enormous dataset in .dat or .csv format. The data that's reported is the number of filings and exemptions that represent a change in address by county from one year to the next, and includes the aggregated adjusted gross income of the total filers. There are some limitations – in order to protect confidentiality, if the flow from one county to another consists of fewer than 10 moves, that data is lumped into an "other" category. International migration is also lumped into a single international category (the Forbes map depicts neither the small-flow "other" category nor the foreign migration category).
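The fewer-than-10 suppression rule could be sketched like this; the record layout and county codes below are invented for illustration and are not the actual IRS file format:

```python
# Sketch of the confidentiality rule described above: county-to-county
# flows with fewer than 10 moves get lumped into an "other" category
# rather than reported individually.
def suppress_small_flows(flows, threshold=10):
    """flows: list of (origin_fips, dest_fips, moves) tuples."""
    reported, other_total = [], 0
    for origin, dest, moves in flows:
        if moves < threshold:
            other_total += moves   # too small to disclose; pool it
        else:
            reported.append((origin, dest, moves))
    reported.append((None, "other", other_total))
    return reported

sample = [("36047", "36061", 1500), ("36047", "02185", 4)]
print(suppress_small_flows(sample))
```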

The IRS migration data is often used when creating population estimates; when combined with vital statistics on births and deaths, it can serve as the migration piece of the demographic balancing equation.
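That balancing equation is simple enough to write down directly; the numbers here are invented for illustration:

```python
# The demographic balancing equation: ending population equals starting
# population plus births, minus deaths, plus net migration.
def estimate_population(start, births, deaths, net_migration):
    return start + births - deaths + net_migration

print(estimate_population(100_000, 1_200, 900, -350))  # 99950
```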

ZIP Code KML Map for NYC Census Data

Saturday, September 10th, 2011

With the release of both the 2010 Census profiles for ZCTAs (ZIP Code Tabulation Areas) and the TIGER line files for 2010 Census geographies, I created another Google Map finding aid for NYC neighborhood data by ZIP code (I previously created one for PUMAs with American Community Survey data). Once again I used the Export to KML plugin that was created for ArcGIS. This allowed me to use the TIGER shapefile in ArcGIS to create the map I wanted and then export it as a KML, while using fields in the attribute table of each feature to insert the ZCTA number into stable links for the census profiles, automatically generating a unique URL for each feature. Click on the ZCTA in the map, and then click on a link to open a profile directly from the new American Factfinder.

There were two new obstacles I had to contend with this time. The first was that my department has finally migrated to Windows 7 from Windows XP, and I upgraded from ArcGIS 9.3 to 10. I had to reinstall the Export to KML plugin (version 2.5.5) and ran into trouble; fortunately all the work-arounds were included in the plugin’s documentation. I don’t have administrator rights on my machine, so I had to have someone install the plugin as an administrator; this included running the initial setup file AND running Arc as an administrator as you add and turn the plugin on. That was straightforward, but when I ran it the first time I got an error message – there’s a particular Windows dll or ocx file that the plugin needs and it was missing (presumably something that was included in XP but not in 7). I downloaded the necessary file, and with administrator rights moved it into the system32 folder and registered the file via the command line. After that I was good to go.

The second issue was with the Census Bureau's new American Factfinder. With the old Factfinder, the URLs that were generated as you built and accessed tables were static, and you could simply save and bookmark them. Not the case in the new Factfinder: you can bookmark some basic tables, but most of them are "too complex to bookmark"; you can save and download queries from the online app but that's it. After some digging I found a Census Bureau document that explains how to create deep links to any query you run and table you create. The URL consists of a fixed series of codes that identify the dataset, year, table, and geography. So this link:

http://factfinder2.census.gov/bkmk/table/1.0/en/DEC/10_DP/DPDP1/8600000US10010

This URL tells us that we're getting a table from version 1.0 of the American Factfinder in English. It's from the Decennial Census, 2010 Demographic Profiles, Demographic Profile Table 1, for ZCTA 10010 (860 is the summary level code that indicates we're looking at ZCTAs). For the plugin to create the links, I included this URL but replaced the last five digits with the attribute from the ZCTA shapefile that holds the ZCTA code. So when the plugin creates the KML, each feature gets a link generated that is specific to it:

http://factfinder2.census.gov/bkmk/table/1.0/en/DEC/10_DP/DPDP1/8600000US[ZCTA5CE10]
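The link-building pattern might be sketched like this (the function name is mine; only the URL structure comes from the Census Bureau document described above):

```python
# Sketch of the deep-link pattern: the Factfinder URL is fixed except for
# the geographic code tacked onto the end. 860 is the ZCTA summary level,
# and "0000US" completes the fixed geo-id prefix before the ZCTA code.
BASE = "http://factfinder2.census.gov/bkmk/table/1.0/en/DEC/10_DP/DPDP1/"

def zcta_profile_url(zcta):
    """Build a deep link to the 2010 Demographic Profile for one ZCTA."""
    return BASE + "8600000US" + zcta

print(zcta_profile_url("10010"))
```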

You can see this previous post for details on how the Export to KML plugin works.

For now, the 2010 and 2000 Census are in the new American Factfinder. The American Community Survey, the Economic Census, population estimates, and a few other datasets are still in the older, legacy Factfinder. According to the CB all of this data will be migrated to the new Factfinder by the end of 2011 and the legacy version will disappear. At that point I’ll have to update my PUMA map so that it points to the profiles in the new Factfinder.

You can take a look at the ZCTA map and profiles below (I’m hosting it on the NYC data resource guide I’ve created for my college). As I’ve written before, ZCTAs are odd Census geographies since they are approximations of residential USPS ZIP Codes created by aggregating census blocks based on addresses; you can see in many instances where boundaries have a blocky teeth-like appearance instead of straight lines. Since they’re created directly by aggregating blocks, ZCTAs don’t correspond or mesh with other census boundaries like tracts or PUMAs, or even legal boundaries like counties. In some cases my assignment of county-based colors doesn’t ring true. For example, ZCTA 11370 includes part of the East Elmhurst neighborhood in Queens and Rikers Island, which is in the Bronx. ZCTA 10463 includes the Bronx neighborhoods of Kingsbridge and Spuyten Duyvil and the Manhattan neighborhood of Marble Hill (a geographic anomaly; it’s not on the Island of Manhattan but it’s part of Manhattan borough).

The most salient issue with ZCTAs is that they are only tabulated for the decennial census and not the American Community Survey; the currency of data and spectrum of census variables will be limited compared to other types of geography. ***NOTE*** This is no longer the case – ZCTA-level data is now available as part of the 5-year ACS, beginning with the 2007-2011 series.


View Larger Map

Some 2010 Census Updates

Monday, February 7th, 2011

Some geography updates to pass along regarding new US Census data:

  • The Census has released a few 2010 map widgets that you can embed in web pages. One shows population change, density, and apportionment for the whole country at the state level, while the other shows population, race, and Hispanic change for states at the county level. As of this post only four states are ready (LA, MS, NJ, and VA) but they’ll be adding the rest once they’re available.
  • The 2010 TIGER Line Files are starting to be released; they’ve changed the download interface a little bit based on user feedback. Most summary levels / geographic areas are available; some (like ZIP Codes and PUMAs) will be released later this year.
  • They’re also rolling out the new interface for the American Factfinder; currently you can get 2000 Census data, some population estimates, and the 2010 Census data as it becomes available. Other datasets like the American Community Survey and Economic Census will be added over time. Some maps and gov docs librarians have expressed concerned about the change – apparently when you download the data from the new interface the FIPS codes are not “ready to go” for joining to shapefiles; there’s one long geo id that has to be parsed. The other concern is that the 1990 Census won’t be carried over into the new interface at all. The original American Factfinder is slated to come down towards the end of this year.

Google Maps to Create a Census Finding Aid

Thursday, May 13th, 2010

Yikes! It’s been quite awhile since my last post (the past couple months have been a little tough for me), but I just finished an interesting project that I can share.

I constantly get questions from students who are interested in getting recent demographic and socio-economic profiles for neighborhoods in New York City. The problem is that neighborhoods are not officially defined, so we have to look for a surrogate. The City has created neighborhood-like areas called community districts out of census tracts and publishes profiles for them, but this data is from the decennial census and not current enough for their needs. ZIP code data is also only available from the decennial census.

We can use PUMAs (Public Use Microdata Areas) to approximate neighborhoods in large cities, and they are published as part of the 3 year estimates of the American Community Survey. The problem is, in order to look up the data from the census you need to search by PUMA number – there are no qualitative place names. The city and the census have worked together to assign names to neighborhoods as part of the NYC Housing and Vacancy Survey, but this is the only place (I’ve found) that uses these names. You need to look in several places to figure out what the PUMA number and boundaries for an area are and then navigate through the census site to find it. Too much for the average student who visits me at the reference desk or emails me looking for data.

My solution was to create a finding aid in Google maps that tied everything together:

View Larger Map

I downloaded PUMA boundaries from the Census TIGER file site in a shapefile format. I opened them up in ArcGIS and used an excellent script that I downloaded called Export to KML. ArcGIS 9.3 does support KML exports via the toolbox, and there are a number of other scripts and stand-alone programs that can do this (I tried several) but Export to KML was best (assuming you have access to ArcGIS) in terms of the level of customization and the thoroughness of the user documentation. I symbolized the PUMAs in ArcGIS using the colors and line thickness that I wanted and fired up the tool. It allows you to automatically group and color features based on the layer’s symbology. I was able to add a “snippet” to each feature to help identify it (I used the PUMA number as the attribute name and the neighborhood name as my snippet, so both appear in the legend) and added a description that would appear in the pop up window when that feature is clicked. In that description, I added the URL from the ACS census profile page for a particular PUMA – the cool part here is that the URL is consistent and contains the PUMA number. So, I replaced the specific number and inserted the [field] name from the PUMAs attribute table that contained the number. When I did the export, the URLs for each individual feature were created with their PUMA number inserted into the link.
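A stripped-down sketch of the URL-substitution idea, using the Python stdlib rather than the actual ArcGIS plugin (the template URL, PUMA number, and neighborhood name below are hypothetical sample values):

```python
# Sketch of what the Export to KML tool does with the URL template: for
# each feature, substitute the PUMA number from the attribute table into
# the profile link placed in the placemark's description.
import xml.etree.ElementTree as ET

# Hypothetical template; the real one is the ACS profile URL from the post
URL_TEMPLATE = "http://example.com/acs/profile/{puma}"

def make_placemark(puma, name):
    """Build one KML Placemark with the PUMA-specific link in its description."""
    pm = ET.Element("Placemark")
    ET.SubElement(pm, "name").text = puma        # PUMA number in the legend
    ET.SubElement(pm, "Snippet").text = name     # neighborhood name as snippet
    ET.SubElement(pm, "description").text = URL_TEMPLATE.format(puma=puma)
    return pm

pm = make_placemark("03805", "Sample Neighborhood")
print(ET.tostring(pm, encoding="unicode"))
```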

There were a few quirks – I discovered that you can't automatically display labels on a Google Map without subterfuge, like creating the labels as images instead of text. Google Earth (but not Maps) supports labels if you create multi-geometry where you have a point for a label and a polygon for the feature. If you select a labeling attribute on the initial options screen of the Export to KML tool, you create an icon in the middle of each polygon with a different description pop-up (which I didn't want, so I left it set to none and lived without labels). I made my features 75% transparent (a handy feature of Export to KML) so that you could see the underlying Google Map features through the PUMAs, but this made the fill AND the lines transparent, making the features too difficult to see. After the export I opened the KML in a text editor and changed the color values for the lines / boundaries by hand, which was easy since the styles are saved by feature group (boroughs) and not by individual feature (PUMAs). I also manually changed the value of the folder open element (from 0 to 1) so that the features and feature groups (PUMAs and boroughs) are expanded by default when someone opens the map.
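Those two hand edits could also be scripted; here's a minimal sketch on a toy KML fragment (the color value is a made-up example; KML colors are aabbggrr hex, so bumping the leading alpha byte to ff makes a line fully opaque):

```python
# Sketch of the two hand edits described above: restore full opacity on
# the line color and flip <open>0</open> to <open>1</open> so folders
# start expanded. Simple string replaces on the exported KML text.
kml = ("<Style><LineStyle><color>40ff0000</color></LineStyle></Style>"
       "<open>0</open>")

kml = kml.replace("<color>40ff0000</color>", "<color>ffff0000</color>")
kml = kml.replace("<open>0</open>", "<open>1</open>")
print(kml)
```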

After making the manual edits, I uploaded the KML to my webserver and pasted the url for it into the Google Maps search box, which overlayed my KML on the map. Then I was able to get a persistent link to the map and code for embedding it into websites via the Google Map Interface. No need to add it to Google My Maps, as I have my own space. One big quirk – it’s difficult to make changes to an existing KML once you’ve uploaded and displayed it. After I uploaded what I thought would be my final version I noticed a typo. So I fixed it locally, uploaded the KML and overwrote the old one. But – the changes I made didn’t appear. I tried reloading and clearing the cache in my browser, but no good – once the KML is uploaded and Google caches it, you won’t see any of your changes until Google re-caches. The conventional wisdom is to change the name of the file every single time – which is pretty dumb as you’ll never be able to have a persistent link to anything. There are ways to circumvent the problem, or you can just wait it out. I waited one day and by the next the file was updated; good enough for me, as I’ll only need to update it once a year.

I’m hosting the map, along with some static PDF maps and a spreadsheet of PUMA names and neighborhood numbers, from the NYC Data LibGuide I created (part of my college’s collection of research guides). If you’re looking for neighborhood names to associate with PUMA numbers for your city, you’ll have to hunt around and see if a local planning agency or non-profit has created them for a project or research study (as the Census Bureau does not create them). For example, the County of Los Angeles Department of Mental Health uses pumas in a large study they did where they associated local place names with each puma.

If you’re interested in dabbling in some KML, there’s Google’s KML tutorial. I’d also recommend The KML Handbook by Josie Wernecke. The catch for any guide to KML is that while all KML elements are supported by Google Earth, there’s only partial support for Google Maps.

Track 2010 Census Participation Rates

Tuesday, March 30th, 2010

The 2010 Census is in full swing – the target date of April 1st is coming up soon. I mailed my form back last week. If you're curious as to how many others have mailed theirs back, check out the bureau's interactive Take 10 Map. Built on top of a Google Map interface, it allows you to track participation rates by state, county, place, reservation, and census tract. You can zoom in to change the scale and select different geography, or enter a zip code, city, or state to zoom to an area of choice. Clicking on an area will display its participation rate to date, compared to the state and national rates.

Data is updated daily, Monday through Friday. Once you click on a particular area, if you click the Track Participation Rate link it will create a widget that you can embed in a website to provide the updated rate. Unlike a lot of the other interactive web maps floating around these days, the bureau does give you the ability to download the actual data behind the map, if you want to do some analysis of your own.

Reading List for Geographic Information Course

Saturday, August 29th, 2009

The fall semester is here, and I’m about to start teaching the class I mentioned in my last post (an information studies course on geographic information). I thought I’d share my reading list and try out the Open Book plugin. I chose my readings based on: my particular audience (undergraduate students from many disciplines with little or no background in geography), relevance (materials appropriate in a hybrid information studies / geography course), cost (wanting to assign the students a single textbook that’s reasonably priced and covers all the bases, and will supplement with other readings), and copyright (staying within the bounds of fair use by not assigning too much from a single work). Here goes:

[openbook]1593852002[/openbook] I decided to go with Krygier and Wood's Making Maps as my assigned textbook. Since cartography is a visual and technical art, I thought it made sense to use a book that relies on visuals for explanations rather than text. It's approachable, particularly for my students who won't be coming from a geography background, affordable, wonderfully quirky, and covers all of the essentials of the geographic framework and map interpretation and design independent of specific GIS software.

[openbook]1405106727[/openbook] I’m using the first chapter of Cresswell’s book as a succinct introduction to how individuals define places, but would recommend the rest of the text for classes that cover geographic concepts and methods.

[openbook]026208354X[/openbook] I’m assigning the second and third chapters of Hill’s book. The second chapter, which discusses how people process, store, and use geographic information is the best summary of this topic that I’ve ever seen, and the third chapter is a good overview of the different types of geographic objects. As a librarian-geo nerd, I love the chapters that deal with coordinate metadata and gazetteers, but won’t be using them in this class.

[openbook]0262620014[/openbook] This is an urban planning / design classic, and I'll have my students read the summary of Lynch's city elements (based on his research, Lynch proposed that people mentally break the urban environment down into five types of elements in order to organize and navigate the city: paths, edges, districts, nodes, and landmarks).

[openbook]0470129050[/openbook] This is the only traditional textbook that I'll be borrowing from (I actually used it when I was a freshman, way back when). While I'm using the previous three books to discuss egocentric places, or how we as individuals conceive of place, I'm using the first chapter of this book to give the students an overview of geocentric places – the formal, defined hierarchy of places that exist in the world – and to introduce them to the concept of regions.

[openbook]0226534146[/openbook] This has become a modern classic and I almost assigned it as a second textbook. I am assigning the chapter on maps for propaganda as a background to our discussion on map interpretation and communication, and will later use the chapter on census maps to talk about the effects of data classification and choice of enumeration units.

[openbook]1934356069[/openbook] This is the only software book that I'll be using chapters from, so the students have some formal guide for using QGIS (in addition to the QGIS documentation). I'm using the chapters on vector and raster data.

[openbook]1412910161[/openbook] This concise, excellent book deals strictly with the concepts and principles behind GIS. I'm using the chapters on spatial search and geoprocessing, but would recommend the entire book for any GIS course, novice to advanced.

In addition to chapters from these books, I’ll also be using:

  • “Revolutions in Mapping” by John Noble Wilford, National Geographic Feb 1998 – a great overview of the history of cartography
  • USGS GIS poster – if there is such a thing, this is a “web classic” and an accessible intro to GIS
  • One article from a scholarly journal and one article from a mass market magazine to illustrate how geographic research is covered and used
  • And for shameless self-promotion, a summary I wrote about US Census data – In Three Parts

Finally, an honorable mention:

[openbook]1593855664[/openbook] If I were teaching an introductory GIS course in a geography or earth sciences department, this is certainly the book I would use, and for those of you in that boat I'd recommend checking it out. It does an excellent job of covering GIS principles without being software specific, contains exercises at the end of each chapter, and is well written and affordable. Since the scope of my course is broader than GIS and my audience more general and diverse, I opted to leave it out (but may still assign a chapter).

Print Composer in QGIS – ACS Puma Maps

Sunday, July 12th, 2009

I wrapped up a project recently where I created some thematic maps of 2005-2007 ACS PUMA-level census data for New York State. I decided to do all the mapping in open source QGIS, and was quite happy with the result, which leads me to retract a statement from a post I made last year, where I suggested that QGIS may not be the best for map layout. The end product looked just as good as maps I've created in ArcGIS. There were a few tricks and quirks in using the QGIS Print Composer and I wanted to share those here. I'm using QGIS 1.0.2 Kore, and since I was at work I was using Windows XP with SP3 (I run Ubuntu at home but haven't experimented with all of these steps yet using Linux). Please note that the data in this map isn't very strong – the subgroup I was mapping was so small that there were large margins of error for many of the PUMAs, and in many cases the data was suppressed. But the map itself is a good example of what an ACS PUMA map can look like, and of what QGIS can do.

  • Inset Map – The map was of New York State, but I needed to add an inset map of New York City so the details there were not obscured. This was just a simple matter of using the Add New Map button for the first map, and doing it a second time for the inset. In the item tab for the map, I changed the preview from rectangle to cache and I had maps of NY state in each map. Changing the focus and zoom of the inset map was easy, once I realized that I could use the scroll on my mouse to zoom in and out and the Move Item Content button (hand over the globe) to re-position the extent (you can also manually type in the scale in the map item tab). Unlike other GIS software I’ve experimented with, the extent of the map layout window is not dynamically tied to the data view – which is a good thing! It means I can have these two maps with different extents based on data in one data window. Then it was just a matter of using the buttons to raise or lower one element over another.
  • Legend – Adding the legend was a snap, and editing each aspect of the legend, the data class labels, and the categories was a piece of cake. You can give your data global labels in the symbology tab for the layer, or you can simply alter them in the legend. One quirk for the legend and the inset map – if you assign a frame outline that's less than 1.0 and then save and exit your map, QGIS doesn't remember the setting when you open the map again – it sets the outline to zero.
  • Text Boxes / Labels – Adding them was straightforward, but you have to make sure that the label box is large enough to grab and move. One annoyance here: if you accidentally select the wrong item and move your map frame instead of the label, there is no undo button or hotkey. If you have to insert a lot of labels or free text, it can be tiresome because you can't simply copy and paste a label – you have to create a new one each time, which means adjusting the font size and type, changing the opacity, turning the outline to zero, etc., each time. Also, if the label looks "off" compared to any automatic labeling you've done in the data window, don't sweat it. After you print or export the map it will look fine.
  • North Arrow – QGIS does have a plugin for north arrows, but the arrow appears in the data view and not in the print layout. To get a north arrow, I inserted a text label, went into the font menu, and chose a font called ESRI symbols, which contains tons of north arrows. I just had to make the font really large, and experiment with hitting keys to get the arrow I wanted.
  • Scale Bar – This was the biggest weakness of the print composer. The scale bar automatically takes the unit of measurement from your map, and there doesn't seem to be an option to convert your measurement units – which means you're showing units in feet, meters, or decimal degrees instead of miles or kilometers, and that doesn't make a lot of sense. Since I was making a thematic map, I left the scale bar off. If anyone has some suggestions for getting around this, or if I'm totally missing something, please chime in.
  • Exporting to Image – I exported my map to an image file, which was pretty simple. One quirk here – regardless of what you set as your paper size, QGIS will ignore this and export your map out as the optimal size based on the print quality (dpi) that you’ve set (this isn’t unique to QGIS – ArcGIS behaves the same way when you export a map). If you create an image that you need to insert into a report or web page, you’ll have to mess around with the dpi to get the correct size. The map I’ve linked to in this post uses the default 300 dpi in a PNG format.
  • Printing to PDF – QGIS doesn't have a built-in export function for PDF, so you have to use a PDF print driver via your print screen (if you don't have the Adobe PDF printer or a reasonable facsimile pre-installed, there are a number of free ones available on sourceforge – PDFCreator is a good one). I tried Adobe and PDFCreator and ran into trouble both times. For some reason when I printed to PDF it was unable to print the polygon layer I had in either the inset map or the primary map (I had a polygon layer of PUMAs and a point layer of PUMA centroids showing MOEs). It appeared to start drawing the polygon layer but then stop near the top of the map. I fiddled with the internal settings of both PDF drivers to no avail, and after much tinkering found the answer. Right before printing to PDF, if I selected the inset map, chose the Move Item Content button (hand with globe), used the arrow key to move the extent up one and then back one to return it to its original position, and then printed the map, it worked! I have no idea why, but it did the trick. After printing the map once, you have to re-do this trick to print it again. I also noticed that after hitting print, if the map blinked and I could see all the elements, I knew it would work; but if the map blinked and I momentarily didn't see the polygon layer, I knew it wouldn't export correctly.
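On the scale bar complaint above: the conversion can at least be worked out by hand, by computing the segment size in map units yourself. Here's a sketch of the arithmetic in Python, assuming your layers use a CRS measured in meters; the 1,609 meters-per-mile constant and the function name are mine, purely for illustration.

```python
# Sketch: computing a scale bar segment size by hand, assuming the
# map's CRS is in meters and you want the bar labeled in miles.
# One mile is roughly 1,609 meters.
METERS_PER_MILE = 1609

def segment_size_meters(miles_per_segment):
    """Map units (meters) corresponding to one scale bar segment."""
    return miles_per_segment * METERS_PER_MILE

# A 200-mile segment works out to 321,800 meters:
print(segment_size_meters(200))  # 321800
```

Typing that figure in as the segment size gives you a bar in round numbers of miles, even though the composer itself only knows about meters.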
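On the image-export quirk above: the relationship between paper size, dpi, and the exported image is simple arithmetic, which makes the "mess around with the dpi" step easy to reason about. A quick sketch (the letter-size dimensions and function names are my own example, not anything QGIS-specific):

```python
# Sketch: exported image size is paper size (inches) times dpi,
# so to hit a target pixel width you solve backwards for the dpi.
def export_pixels(width_in, height_in, dpi):
    """Pixel dimensions of an exported map at a given print quality."""
    return round(width_in * dpi), round(height_in * dpi)

def dpi_for_target_width(width_in, target_px):
    """The dpi setting needed to export at a desired pixel width."""
    return target_px / width_in

print(export_pixels(8.5, 11, 300))     # (2550, 3300)
print(dpi_for_target_width(8.5, 850))  # 100.0
```

So a letter-size layout at the default 300 dpi comes out at 2550 x 3300 pixels – far larger than most web pages or reports need, hence the fiddling.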

Despite a few quirks (what software doesn’t have them), I was really happy with the end result and find myself using QGIS more and more for making basic to intermediate maps at work. Not only was the print composer good, but I was also able to complete all of the pre-processing steps using QGIS or another open source tool. I’ll wrap up by giving you the details of the entire process, and links to previous posts where I discuss those particular issues.

I used 2005-2007 American Community Survey (ACS) data from the US Census Bureau, and mapped the data at the PUMA level. I had to aggregate and calculate percentages for the data I downloaded, which required using a number of spreadsheet formulas to calculate new margins of error (MOEs). I downloaded a PUMA shapefile layer from the US Census Generalized Cartographic Boundary files page, since generalized features were appropriate at the scale I was using. The shapefile had an undefined coordinate system, so I defined it. Using the Ftools add-on in QGIS, I converted the shapefile from single-part to multi-part features. Then I used Ftools to join my shapefile to the ACS data table I had downloaded and cleaned up (I had to save the data table as a DBF in order to do the join). Once they were joined, I classified the data using natural breaks (I sorted and eyeballed the data and manually created breaks based on where I thought there were gaps). I used the Color Brewer tool to choose a good color scheme, and entered the RGB values in the color / symbology screen. Once I had those colors, I saved them as custom colors so I could use them again and again. Then I used Ftools to create a polygon centroid layer out of my PUMA/data layer, and used this new point layer to map my margin of error values. Finally, I went into the print composer and set everything up. I exported my maps out as PNGs, since this is a good image format for preserving the quality of the maps, and as PDFs.
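The MOE spreadsheet formulas mentioned above follow the standard Census Bureau guidance for ACS data: the MOE of a sum of estimates is the square root of the sum of the squared MOEs, and the MOE of a derived proportion subtracts the proportion-weighted denominator term under the radical. Here's a sketch of both in Python for illustration (the function names are mine):

```python
import math

# Standard Census Bureau approximations for derived ACS estimates.
def moe_of_sum(moes):
    """MOE for a sum of estimates: sqrt of the sum of squared MOEs."""
    return math.sqrt(sum(m ** 2 for m in moes))

def moe_of_proportion(num, moe_num, den, moe_den):
    """MOE for a proportion p = num/den, where num is a subset of den."""
    p = num / den
    radicand = moe_num ** 2 - (p ** 2) * (moe_den ** 2)
    if radicand < 0:  # Census guidance: fall back to the ratio formula
        radicand = moe_num ** 2 + (p ** 2) * (moe_den ** 2)
    return math.sqrt(radicand) / den

# e.g. aggregating three estimates with MOEs of 30, 40, and 120:
print(round(moe_of_sum([30, 40, 120]), 1))  # 130.0
```

The aggregated MOE grows much more slowly than a straight sum of the MOEs would, which is why aggregating small-area ACS estimates is usually worth the trouble.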

NY Times Interactive Maps

Sunday, March 29th, 2009

The New York Times has been putting together some great, web-based, thematic maps lately. I thought I’d provide a summary of some of the latest and greatest here.

US Maps

  • Immigration Explorer – Explore foreign born groups for the United States by county, based on the decennial census from 1880 to 2000. Choropleth maps of the largest immigrant group per county and graduated circle maps depicting the size of each group. 3/10/2009.
  • The Geography of a Recession – Choropleth map of the US that shows the annual change in unemployment by county. Lets you filter by county type (urban, rural, manufacturing areas, housing bubbles). 3/7/2009.
  • A Growing Detention Network – A graduated circle map of the US that shows detention centers where people are held on immigration violations, by number of detainees and type of facility. 12/26/2008.
  • Where Homes Are Worth Less Than the Mortgage – State-based US choropleth and graduated circle maps of the housing and debt crisis. 11/10/2008.

NYC Maps

  • New Yorkers Assess Their City – How New Yorkers rate their neighborhood based on quality of life, city services, education, transportation and crime. Based on a large survey of city residents. Choropleth maps of community districts. 3/7/2009.
  • Census Shows Growing Diversity in New York City – Choropleth maps show median rent and median income by PUMAs in 2000 and 2007. An example of mapping 3-year ACS data by PUMAs to show patterns below the county level. 12/9/2008.

World Maps

  • A Map of Olympic Medals – A cartogram of countries based on the number of Olympic medals they've won, for every Olympics from 1896 to 2008. Mouse over to get medal counts. 8/4/2008.

Copyright © 2017 Gothos. All Rights Reserved.