CartoDB and WordPress

I get really excited when you can incorporate maps into any part of your daily life. CartoDB just made it easier to embed web maps in WordPress! I love seeing tools that make web mapping less intimidating. Check out their blog post, "WordPress now brings you beautiful maps from CartoDB." Here is a map of the University of Richmond running trails; click on a trail to get its distance. Happy mapping everyone!

University of Richmond, 100 Years Ago


Lily Calaycay ’17 3D modeling North Court in Google SketchUp. Photo by Nate Ayers

Imagine what the University of Richmond’s campus was like over 100 years ago. Chris Kemp, the head of the Discovery, Technology, and Publishing department in Boatwright Library, and his team have been working on documenting the history of the college for its centennial. They have done a fantastic job archiving and sharing historical artifacts of the university’s past. The DSL has been assisting Chris’s group with georeferencing historical campus maps for an upcoming project, which gives a spatial timeline of the university’s history. Since we had fairly detailed maps of campus buildings, roads, and topography, what better way to envision the campus 100 years ago than a physical 3D model?


The base for any 3D model is the elevation data. The elevation data came from a 1911 survey map that was georeferenced and then digitized using various tools in ArcGIS. The contour lines were in 25’ intervals and covered all of campus as we know it today. Since this map only had contour lines and no buildings, we had to use a 1925 campus master plan map to obtain building footprints and road lines. Some buildings were deleted to reflect campus around the same time as the 1911 survey. These digitized footprints were then used in Google SketchUp to model basic building features. Buildings were not highly detailed because of the limited accuracy of the printer. The contour lines were converted to a TIN (Triangulated Irregular Network) to create a surface. The problem with converting the contours to a TIN was that the contour intervals were so large they created an unrealistic surface in certain portions of campus; some hand smoothing was needed in SketchUp before the model was printed to mitigate this. After the contours were converted to a surface, they were imported into ArcScene and exported as a VRML file so the surface could be imported into SketchUp.

Once the TIN surface was exported out of ArcScene, it was passed on to Fred Hagemeister, the Center for Teaching, Learning, and Technologies research analyst, for 3D printing preparations. Once in SketchUp, the surface model was exaggerated 2x to highlight the topography of campus and given a graduated color ramp to show the distinction between high and low elevations. Building footprints were added and the surface was "graded" to better represent how the buildings actually sit within the landscape. The buildings and roads were added and colored to add detail, and the campus started to take shape. Buildings were colored red if they still exist and blue if they are no longer on campus. The final and probably most daunting task was turning the surface from a sheet into a solid shell. Once a solid shell, it was scaled down and sectioned into 12 pieces, since the printer can only print an 8" × 8" piece. Below you will see the 3D printing process: printing, excavation, and finally gluing.

Process

With the power of GIS and collaboration between Boatwright Memorial Library, the Center for Teaching, Learning, and Technologies, and the Digital Scholarship Lab, we were able to take a paper survey map from 1911 and turn it into a physical 3D model. Two maps over a decade apart were stripped down to their raw data and rebuilt together to show a glimpse of what the University of Richmond might have looked like over a century ago. Check out Chris’s blog to learn more about the project.



Chris Kemp placing a section of the 3D model. Model is made up of 12 individual sections. Photo by Angie White


Model once all of the pieces are in place. Photo by Angie White

Richmond Then and Now

Richmond was a very different place a decade after the Civil War than it is today. When we started working on the Visualizing the Past project for the Library of Virginia, we found a wonderful atlas of Richmond in 1876. This hand-painted atlas, published by F.W. Beers, features detailed buildings and their owners, parks, and public landmarks. These maps served as our basemap for the project because of the great detail they provided. While working on the project, I became really interested in what has changed around Richmond since that time.

Richmond Then and Now Scratchoff

This application has been updated! Check out the new Richmond Then and Now!

Click this link or the image to explore Richmond Then and Now

Visualizing the Past

Over the past several months, the DSL has been collaborating with the Library of Virginia and Maurie McInnis, Vice Provost for Academic Affairs and professor of art history at the University of Virginia, on the To Be Sold: Virginia and the American Slave Trade exhibition. Read more about the exhibition below. Our role in the project was to create a 3D visualization of Richmond in the early 1850s. The 3D visualization is used to help visitors envision Eyre Crowe’s journey through Richmond and experience the slave trade through his paintings and engravings. The model’s intent is not to replicate every detail of Richmond in 1853, but to provide a sense of the architectural styles and atmosphere of the city at the time. This has been a challenge considering the time period and the lack of information on such a grand scale. The foundation of any 3D city project is building footprints, which are a little hard to come by for the 1850s. We discovered a map made by F.W. Beers in 1876 that detailed buildings, parks, and other features quite well. Below is one export from David Rumsey’s Map Collection. We also used these maps as the basemap for the model.


F.W. Beers map of Richmond published in 1876. Export from David Rumsey’s Map Collection.

The problem with the Beers map is that it depicts Richmond 20 years later than our decade of interest. To resolve this issue, we referenced several maps of Richmond from the 1850s and adjusted our footprints accordingly. After georeferencing these images and digitizing the footprints, we started to think about modeling methods for a project like this. With the help of Maurie McInnis and Scott Nesbit, we gathered numerous photos showing buildings, along with detailed descriptions of materials and architectural styles for the time period. We wanted to provide the greatest amount of detail without modeling 3,000+ buildings by hand. For buildings we had photos or descriptions of, our student interns built models in Google SketchUp. With help from Nathaniel Ayers and me, the students modeled more than 30 buildings around Richmond for the 1850s. The students really enjoyed the project and got immersed in the details of the buildings they were modeling.

Even though the students modeled over 30 buildings, we still had at least 3,000 left to model, with no real idea what the majority of them looked like. I had heard about CityEngine by ESRI for a while but had never experimented with it. After reading about the Rome Reborn project, I felt it was a perfect solution to our problem. In short, CityEngine uses a procedural modeling approach: by using rule files and GIS data, you can populate a large-scale 3D model in a matter of moments. Maurie helped us a great deal on this portion by providing detailed descriptions of facades and architectural styles found in Richmond at the time. Both the SketchUp and CityEngine models were exported and brought into 3D Studio Max. Nathaniel Ayers did an outstanding job rendering the buildings, adding trees, and animating the video. Bringing everything into 3D Studio Max gave the model consistency, since we used two different software packages to populate the buildings.

This project utilized a combination of modeling approaches, which served us well considering the time period and the information available. Procedural modeling allows us to focus on the architectural details of a specific time and place and apply those styles across a city, giving you a sense of what it might have looked like during that period. Using this along with traditional, more detailed modeling resulted in a stunning visualization that showcases architectural character rather than specific building types, without leaving the rest of the scene compromised where we lacked descriptions of individual buildings. To see the whole video, click here or visit the To Be Sold exhibit at the Library of Virginia, on view Monday, October 27, 2014 through Saturday, May 30, 2015. Along with the exhibition, we hope to present this work at the upcoming ESRI Users Conference this July.

Screen capture showing a bird’s-eye view of Richmond, Virginia, in 1853.

Screen capture showing the Capitol looking west.

Bird’s-eye view looking west over the city.

Screen capture of the American Hotel.

To Be Sold: Virginia and the American Slave Trade
Monday, October 27, 2014—Saturday, May 30, 2015
Time: 9:00 AM–5:00 PM
Place: Lobby and Exhibition Hall. Free

This groundbreaking exhibition will explore the pivotal role that Richmond played in the domestic slave trade. Curated by University of Virginia professor Maurie McInnis, To Be Sold will draw from her recent book, Slaves Waiting for Sale: Abolitionist Art and the American Slave Trade, and be anchored by a series of paintings and engravings by Eyre Crowe, a British artist who witnessed the slave trade as he traveled across the United States in 1853. This internal trade accounted for the largest forced migration of people in the United States, moving as many as two million people from the Upper South to the Cotton South. Virginia was the largest mass exporter of enslaved people through the Richmond market, making the trade the most important economic activity in antebellum Virginia. This exhibition will not be merely a story of numbers and economic impact, but also one that focuses on individuals and the impact that the trade had on enslaved people.

A great open source tool for any GIS user

Before my time at the University of Richmond, the only mention of open source software I heard was from computer science majors and programmers. Open source naturally seemed intimidating, since it was something new and mysterious, and in my mind sub-par to its proprietary counterparts. During my education, both universities I attended had an ESRI site license and rarely touched on open source GIS tools or software. The DSL has always leaned toward open source software and its philosophy, which has been a great learning experience for me. I have begun to see the benefit of using both open source and proprietary software, depending on the task.

At the VAMLIS conference last week, I attended a workshop presented by Jonah Adkins, a senior GIS analyst for GISi. It was a great overview of OpenStreetMap, along with some very useful open source tools on the web. Working with historical data, we tend to digitize a lot of polygons at the DSL. When serving these polygons up on the web, it’s nice to generalize them so they display more efficiently. I have always had trouble simplifying polygons effectively in Arc and have dreaded this process every time I need to generalize a new dataset. Below is a great time-saving tool for any GIS user struggling with simplifying polygons and wondering what tolerance levels to use.

When Jonah showed Mapshaper, "a tool for topologically aware shape simplification" that "reads and writes Shapefile, GeoJSON and TopoJSON formats," at the conference, I was ecstatic! Not only can you upload a shapefile, you can also see the simplification in real time along with the percent change from the original polygons. It is one of the simplest yet most effective tools I have ever used. It is as simple as this (a scripted equivalent is sketched after the steps below):

1. Click on the link above.

2. Configure your settings and upload your file.

3. Slide the simplify bar to simplify.

4. Export to Shapefile, GeoJSON, or TopoJSON.

5. Receive a zip file and enjoy your simplified polygons.

6. Repeat!
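
If you find yourself repeating those steps for many datasets, Mapshaper can also be run from a script. Here is a minimal Node sketch using the mapshaper npm package; the runCommands() entry point, the file names, and the 10% setting are assumptions on my part rather than anything we used in production, so check the project’s documentation before relying on it.

```javascript
// Minimal sketch: batch-simplify a shapefile with the mapshaper npm package.
// Assumes `npm install mapshaper`; the file names and retention level are illustrative.
const mapshaper = require("mapshaper");

mapshaper.runCommands(
  // -simplify 10% keeps roughly 10% of removable vertices while staying topologically aware;
  // keep-shapes prevents small polygons from disappearing entirely;
  // -o format=geojson writes the simplified layer out as GeoJSON.
  "-i footprints.shp -simplify 10% keep-shapes -o format=geojson footprints_simplified.json",
  (err) => {
    if (err) throw err;
    console.log("Wrote footprints_simplified.json");
  }
);
```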


So don’t be hesitant about open source options like I was. They can make your life a lot easier and broaden your knowledge while simplifying your workflows. A mix of proprietary and open source software and tools can be a powerful combination!


What is Value-by-Alpha anyway?

As some of you know, we are currently producing a Digital Atlas of American History. While working on one of the maps for the Atlas, I was searching for a better way of showing foreign-born population than your run-of-the-mill choropleth map. I stumbled upon a paper written by Robert Roth, Andy Woodruff, and Zachary Johnson titled “Value-by-alpha Maps: An Alternative Technique to the Cartogram.” You can read more about it on Andy Woodruff’s blog. After reading the paper and Andy’s blog, I got really excited about trying this technique for our foreign-born population map.

Though choropleth mapping and area cartograms are two of the most common techniques for mapping thematic variables such as foreign-born population, each has significant drawbacks. Choropleth maps fail to distinguish between areas of high and low population. Area cartograms address that issue but can be difficult to interpret given the spatial distortions they introduce. Roth et al. (2010) developed a new method for thematic mapping: value-by-alpha. In the case of foreign-born population maps, the value-by-alpha technique uses varying opacities to highlight areas of high population density and deemphasize areas of low population density. This equalizes the foreign-born variable by density and shows the percentage of the population in each county that was born outside the US, all while preserving both shape and topology. Utilizing this method for foreign-born population effectively highlights high-density areas with large foreign-born populations, revealing patterns that would likely be missed with traditional choropleth or area cartogram techniques.

My first attempt involved ArcGIS. Achieving this in Arc is a little problematic, but you nonetheless get a pretty cool map. My one issue is that Arc really limits the color range and transparency values you can assign to a layer without some finagling. Andrew Wheeler has a great tutorial on his blog about how to do this in ArcGIS. I found a way to get around these limits in Arc, but only after going with my second attempt. To do this, you calculate a transparency value for each feature in the attribute table based on population density, then assign that calculated value using the Display Expression tab under Layer Properties. Here are the results using the method outlined in Andrew’s blog.

The second attempt used Leaflet and JavaScript. After discussing the first method with our director Rob, we decided this would be better achieved using JavaScript and Leaflet. JavaScript allows you to give each individual population density value its own transparency and color value, whereas Arc makes you clump these into categories. Rob helped out a great deal with this method, since my programming skills are very minimal. This gave us the greatest detail and really highlighted areas of high population density and high foreign-born population. The percent foreign-born value is equalized by population density using alpha channels, which visually weights the map and neutralizes areas with low population density. Here is the same map as above, but using JavaScript.
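
To give a sense of how this works, here is a minimal Leaflet sketch of value-by-alpha styling. It assumes a GeoJSON file of counties with pct_foreign_born and pop_density properties; those field names, the color breaks, and the linear density-to-alpha scaling are illustrative assumptions, not the exact values used for our map.

```javascript
// Minimal value-by-alpha sketch in Leaflet: color encodes percent foreign-born,
// opacity encodes population density. Field names and breakpoints are assumed.
const map = L.map("map").setView([37.5, -96.0], 4);

// Color ramp for the mapped variable (percent foreign-born).
function color(pctForeignBorn) {
  return pctForeignBorn > 20 ? "#08519c" :
         pctForeignBorn > 10 ? "#3182bd" :
         pctForeignBorn > 5  ? "#6baed6" :
         pctForeignBorn > 1  ? "#bdd7e7" : "#eff3ff";
}

// Alpha for the equalizing variable (people per square mile).
// Sparse counties fade toward transparent; dense counties stay fully opaque.
function alpha(popDensity) {
  const fullOpacityDensity = 2000; // assumed density at which a county is fully opaque
  return Math.min(popDensity / fullOpacityDensity, 1);
}

fetch("counties.json")
  .then((resp) => resp.json())
  .then((counties) => {
    L.geoJSON(counties, {
      style: (feature) => ({
        fillColor: color(feature.properties.pct_foreign_born),
        fillOpacity: alpha(feature.properties.pop_density),
        color: "#999", // thin county outlines
        weight: 0.5,
      }),
    }).addTo(map);
  });
```

The key design choice is that color carries the variable of interest while opacity carries the equalizing variable, so sparsely populated counties recede into the background instead of dominating the map the way they do in a standard choropleth.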


Though there are a couple of ways to produce a value-by-alpha map, we found that the JavaScript approach gave us the most granular results and really conveyed what we were trying to show with the foreign-born data. Below is the final poster we presented at this year’s VAMLIS Conference.
