Google Maps Twitter Mashup

Google Maps
The map data you see inside Google Maps comes from multiple sources, including user-provided data, but is primarily compiled by Tele Atlas, a global leader in navigation and location-based services. For our mashup we will be combining Google Maps with Twitter, so all of the map data will be called from Google’s database. We will not be downloading any GIS data; it all essentially lives in the cloud. The specific sources of the map data may be found in the lower right corner of the map.

Add Google Maps to your website
In order to add the full interactivity of Google Maps to your website, you need to obtain an API key. Most providers of this kind of data or service require a key in order to control who uses the data, how, and how much. This is the limitation of using someone else’s database versus your own. To create your API key:

1. Visit the APIs Console at https://code.google.com/apis/console and log in with your Google Account.
2. Click the Services link from the left-hand menu.
3. Activate the Google Maps API v3 service.
4. Click the API Access link from the left-hand menu. Your API key is listed on the API Access page, in the Simple API Access section, as the key for browser apps.

Google provides a list of all of the options and commands you can add to your map to improve usability.

This is what the code behind the embedded map looks like:

<!DOCTYPE html>
<html>
  <head>
    <meta name="viewport" content="initial-scale=1.0, user-scalable=no" />
    <style type="text/css">
      html { height: 100% }
      body { height: 100%; margin: 0; padding: 0 }
      #map_canvas { height: 100% }
    </style>
    <script type="text/javascript"
      src="http://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&sensor=SET_TO_TRUE_OR_FALSE">
    </script>
    <script type="text/javascript">
      function initialize() {
        // Set the starting center, zoom level, and map type.
        var myOptions = {
          center: new google.maps.LatLng(-34.397, 150.644),
          zoom: 8,
          mapTypeId: google.maps.MapTypeId.ROADMAP
        };
        // Create the map inside the #map_canvas div.
        var map = new google.maps.Map(document.getElementById("map_canvas"),
            myOptions);
      }
    </script>
  </head>
  <body onload="initialize()">
    <div id="map_canvas" style="width:100%; height:100%"></div>
  </body>
</html>

Notice the lat/long reference: this particular example is centered on Sydney, Australia. The zoom option is self-explanatory and is one of the options listed above. Now that we have the base map, it’s time to add the Twitter feeds.
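
Since the mashup below pulls tweets from around downtown Chicago, the center would presumably be swapped for Chicago coordinates, the same lat/long that appears in the Twitter query in the next section. A minimal sketch, with an assumed city-scale zoom level:

// Re-center the base map on downtown Chicago (coordinates borrowed from the
// Twitter search URL below); zoom 12 is an assumed city-scale value
var myOptions = {
    center: new google.maps.LatLng(41.881256, -87.62485),
    zoom: 12,
    mapTypeId: google.maps.MapTypeId.ROADMAP
};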

Accessing the Twitter API
http://search.twitter.com/search.json?rpp=25&geocode=41.881256,-87.62485,20mi&callback=?
Broken down:

  • q=NATOChicago: the search term
  • rpp=25: the number of tweets to return
  • geocode: return tweets within a given radius of a given lat/long
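
Putting the pieces together, a full request that combines a search term with the geographic filter would presumably look like this (q=NATOChicago is just the example term from the breakdown above):

http://search.twitter.com/search.json?q=NATOChicago&rpp=25&geocode=41.881256,-87.62485,20mi&callback=?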

Since 2009, tweets have had the capability of being geotagged.

When placed into a browser, this is the JSON that gets spit out:

According to this, the following are the key fields we need:

  • object -> results (array) -> from_user
  • object -> results (array) -> location
  • object -> results (array) -> geo
  • object -> results (array) -> profile_image_url
  • object -> results (array) -> text
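
For reference, a single entry in the results array looks roughly like the sketch below (abbreviated and hand-written; the values are placeholders, not real data):

{
  "results": [
    {
      "from_user": "some_user",
      "location": "Chicago, IL",
      "geo": { "type": "Point", "coordinates": [41.88, -87.62] },
      "profile_image_url": "http://example.com/photo.jpg",
      "text": "Example tweet text"
    }
  ]
}

Note that geo lists latitude first, then longitude, which is why the marker code below reads coordinates[0] as the latitude and coordinates[1] as the longitude.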

Displaying the Tweets
Now that we have the tweets, we need to map them.

//Function to get data from Twitter
ryan.getTwitter = function()
{
    //clear any markers from the previous search and start a fresh extent
    removeLayer();
    bounds = new google.maps.LatLngBounds();

    //ask the Twitter search API (via JSONP) for 25 tweets within 20 miles of the current map center
    $.getJSON('http://search.twitter.com/search.json?rpp=25&geocode=' + map.getCenter().lat() + ',' + map.getCenter().lng() + ',20mi&callback=?',
        function(data)
        {
            $.each(data.results, function(i, item){
                if (item.geo == null)
                {
                    trace(i + ' no geo data');
                }
                else
                {
                    //build the info window content: user name, profile picture, tweet text
                    infoWindowContent = '<strong>' + item.from_user + '</strong><br>';
                    infoWindowContent += '<img src="' + item.profile_image_url + '"><br>';
                    infoWindowContent += item.text;
                    ryan.createTwitterMarker(i, item.geo.coordinates[0], item.geo.coordinates[1], infoWindowContent, item.profile_image_url);
                }
            });
        });
}

If the tweet is geotagged, an info window will display the user name, profile picture, and the text. However, most Twitter users have not opted in to having their tweets tagged. We will address that later.

Mapping the Tweets

//Function to create Twitter Marker
ryan.createTwitterMarker = function(i, latitude, longitude, infoWindowContent, icon)
{
    var markerLatLng = new google.maps.LatLng(latitude, longitude);

    //extend the bounds for each tweet and adjust the map to fit them
    bounds.extend(markerLatLng);
    map.fitBounds(bounds);

    //use the tweeter's profile picture as the marker icon, scaled to 32x32
    var image = new google.maps.MarkerImage(icon, null, null, null, new google.maps.Size(32, 32));

    twitter[i] = new google.maps.Marker({
        position: markerLatLng,
        map: map,
        title: infoWindowContent,
        icon: image
        });

    //add an onclick event that opens the shared info window on this marker
    google.maps.event.addListener(twitter[i], 'click', function() {
        infowindow.setContent(infoWindowContent);
        infowindow.open(map, twitter[i]);
        });
}
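
These two functions lean on a few globals that the rest of the page is assumed to set up. The original post does not show that setup, so here is a minimal sketch of what it might look like, assuming it runs after the Maps API script tag has loaded; removeLayer and trace are simple helpers assumed here, not part of the Maps API:

//Assumed supporting setup for the snippets above (a sketch, not the original source)
var ryan = {};                                  //namespace object for the mashup functions
var map;                                        //the google.maps.Map; for the mashup, assign it globally in initialize()
var bounds;                                     //LatLngBounds rebuilt on every Twitter refresh
var twitter = [];                               //markers created by ryan.createTwitterMarker
var infowindow = new google.maps.InfoWindow();  //one shared info window, reused by every marker
var geocoder = new google.maps.Geocoder();      //used by ryan.geocodeTwitter below

//remove the markers left over from the previous search
function removeLayer() {
    for (var i = 0; i < twitter.length; i++) {
        if (twitter[i]) twitter[i].setMap(null);
    }
    twitter = [];
}

//simple debug logger
function trace(message) {
    if (window.console) console.log(message);
}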

Creating a function to Geocode the results

The following code extends getTwitter to handle the non-geotagged tweets: when a tweet has no geo data, it falls back on the location value, which is generally the city associated with the account that sent the tweet.

//Function to get data from Twitter
ryan.getTwitter = function()
{
    removeLayer();
    bounds = new google.maps.LatLngBounds();
    $.getJSON('http://search.twitter.com/search.json?rpp=25&geocode=' + map.getCenter().lat() + ',' + map.getCenter().lng() + ',20mi&callback=?',
        function(data)
        {
            $.each(data.results, function(i, item){
                //build the info window content either way
                infoWindowContent = '<strong>' + item.from_user + '</strong><br>';
                infoWindowContent += '<img src="' + item.profile_image_url + '"><br>';
                infoWindowContent += item.text;

                if (item.geo == null)
                {
                    //no coordinates: geocode the free-text location instead
                    ryan.geocodeTwitter(i, item.location, infoWindowContent, item.profile_image_url);
                }
                else
                {
                    //coordinates available: place the marker directly
                    ryan.createTwitterMarker(i, item.geo.coordinates[0], item.geo.coordinates[1], infoWindowContent, item.profile_image_url);
                }
            });
        });
}

Now the geocoder function:

ryan.geocodeTwitter = function(i, genLocation, infoWindowContent, icon)
{
    var geocoderRequest = {address: genLocation};
    geocoder.geocode(geocoderRequest, function(results, status) {
        if (status == google.maps.GeocoderStatus.OK)
        {
            //use the lat()/lng() accessors rather than internal property names
            ryan.createTwitterMarker(i, results[0].geometry.location.lat(), results[0].geometry.location.lng(), infoWindowContent, icon);
            trace(i + ' lat/long');
            trace(results[0].geometry.location);
        }
        else
        {
            trace("Geocode was not successful for the following reason: " + status);
        }
    });
}

The function above runs each tweet’s location string through Google’s geocoder and then hands the result to createTwitterMarker to place a marker on the map. The problem is that, since most tweets are not geotagged, many of them resolve to the same city-level coordinates, and the resulting markers end up stacked on top of one another.

Conclusion
My precedent studies touted the value of geocoded social media in times of natural disasters and man-made crises such as wars. However, given the lack of participation by most users, the value of this proposal is severely limited, so tracking tweets may not be the best way to track larger movements of people. One might instead track something that is more likely to have geocoordinates assigned automatically, such as photos. Flickr would be a natural next step.

BUS ROUTE 29 2000 TO 2010 COMMUNITY VITALITY

THE HUNCH:

The Chicago Housing Authority severely affected community vitality by demolishing Housing Projects between 2000 and 2010.

Community Vitality measured with:
  • CTA ridership
  • Per Capita Income
  • Population

It is difficult to draw a correlation between the CHA initiative of public housing demolition and any significant changes in neighborhood quality.

The values are too divergent to draw a conclusion. However, this was a great exercise in learning how to use an object-relational database management system, generate spatial queries, and effectively render geographic information, all with open-source software!

ARCH_497_S_MCCREE_S_2012

A STUDY IN CHICAGO RECYCLING SYSTEM

by Soledad Hernandez & Pinar Dursun

Problem

Millions of tons of waste are disposed of into our environment every year. As urban growth continues to take hold in many cities, the levels of all types of waste, combined with the problems created when disposing of them, are constantly increasing. Faced with this situation, an efficient waste management system can solve a basic problem in cities.

Chicago generates 7,299,174 tons of waste every year, and residents recycle just over 200,000 tons of materials per year.

Chicago has two recycling systems: Blue Cart and Drop-off

The goals

The aim of this project is to analyze the recycling system in Chicago by:

  • Examining the effectiveness of the Chicago recycling system
  • Asking how the recycling system of Chicago can be improved
  • Mapping the distribution of recycling amounts by location
  • Looking for a correlation between recycling amounts and demographic information

Process

After preparing spreadsheets of the data and uploading them to the iituim server database via PostgreSQL as tables, shapefiles were imported into QGIS, where the connections between the amounts and the locations were made. Thus, the equations for the analysis were created.

Maps

Conclusion

22 wards have neither a drop-off center nor the Blue Cart system; 1,186,364 people live in these wards without any recycling service.

It is obvious that the Blue Cart system is more efficient than drop-off centers because of its convenience. Having to travel miles to throw recyclables into a drop-off center, instead of putting them into a Blue Cart in front of the house, discourages people who do not live in Blue Cart neighborhoods.

It also seems that people living in the northern neighborhoods are more eager to recycle, so the north side can be proposed as the location for future Blue Cart expansion.

For the complete presentation with all the maps: RECYCLING 5.4.12

Case Study: Noise Maps

A noise map is a graphic representation of the sound level distribution. Noise maps are used for calculating the areas affected by noise, determining the number of sensitive buildings affected by high noise levels, and developing noise prediction models. The spatial database and spatial analysis tools of GIS are useful for monitoring noise and its impact.

European Union Member States are required to produce strategic noise maps in their main cities.

What are the necessities for a noise map?

  • A 3D city model of the area
  • Software packages: ArcView/ArcGIS (ArcMap, Spatial Analyst, 3D Analyst, and the ArcScene extension), standard noise calculation software, Point Cloud Mapper (PCM), FIELDS

The steps for 3D noise maps:

Step 1: Collection of data

Step 2: Building the 3D city model, and extracting and organizing data about its objects for the noise calculation.

Step 3: Generating the 3D noise observation points and building 3D noise model.

These observation points represent the locations of virtual microphones where the noise levels are to be calculated. The acoustic indicators can be determined by computation or measurement methods, but computation methods are widely preferred. Noise levels are calculated at each observation point using noise calculation software.

When the results are obtained, spatial interpolation is applied to give a continuous graphical representation of sound levels using GIS tools.

The 3D noise map shows a volumetric view of noise levels over the road surface of the study area.

These are the 3D noise map and noise contours of Delft. Inhabitants on lower floors are more affected than those on upper floors.

The size and position of noise barriers can be decided optimally using 3D noise models. Higher barriers located close to the road are more effective at blocking noise.

For more information about noise maps:

http://www.navcon.com/citynoisemap.htm

For Delft noise map:

http://www.gem-msc.org/Academic%20Output/Kurakula%20Vinay.pdf

For Paris noise map:

http://www.v1.paris.fr/commun/v2asp/fr/environnement/bruit/carto_jour_nuit/cartobruit.h

Fire Safety in High Rise Building

Chicago is full of skyscrapers, and fire safety is one of the biggest issues in high-rise buildings. Fire poses a particularly serious threat in high-rises. First, it is difficult for firefighters to reach the upper floors; for example, the tallest fire truck ladder in Chicago only extends to the eighth floor. To extinguish blazes above that point, firefighters must sometimes climb dozens of flights of stairs, dragging fire hoses and other heavy equipment with them.

Second, large building populations are difficult to evacuate rapidly and safely. Since elevators do not offer a safe means of exit during a fire, thousands of people may be forced to descend crowded stairs, and the dangers are intensified by the noise, smoke, darkness, and confusion of a high-rise fire, particularly for those attempting to escape from an upper floor.

My idea behind this study is to create a building model based on CityGML and GIS that can be helpful to emergency crews during an emergency. Technologies such as GIS and CityGML can be used for the analysis and visualization of spatial data and 3D city models, which is helpful in planning, simulation, and training.

Data is available from the different sources listed below, which poses challenges for extracting useful information:

  • Different sources of data: city databases, the owner, architectural firms, emergency communication services
  • Different types of data: tabular data, spatial data, 3D models, shapefiles, floor plans (or the Google floor plan project), etc.
  • Different formats of data: points, dates, coordinates, etc.
  • Various inconsistent schemas, e.g., some data is available in CityGML and other data in GIS formats

In order to maintain high integrity among the various data sources, we have to model our data based on standards. After that, we can import the data into a common data store. In the case of building data, it can be modeled according to the IFC or CityGML standard for the 3D building model, and 2D data can be modeled in GIS formats.

After we model the data, we use data mining techniques to extract useful information from it. We can use different AI techniques to find different uses of our data.

UML diagram-Building Model

Scenario 1

  • If a fire is detected on multiple floors of a high-rise building, AI can help firefighters create smart evacuation plans on the fly using digitized floor plans and building sensors, as sketched below.
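
As a toy illustration of that idea (not part of the original project), the sketch below treats a digitized floor plan as a graph of rooms, corridors, and stairs and runs a breadth-first search that avoids nodes flagged by fire sensors; all names and data here are invented for the example:

//Toy example: find the shortest evacuation route through a floor-plan graph,
//skipping any node a sensor has flagged as on fire; all data is invented
var floorPlan = {                       //adjacency list: node -> connected nodes
    'room501': ['corridor5A'],
    'room502': ['corridor5A'],
    'corridor5A': ['room501', 'room502', 'stairB', 'stairC'],
    'stairB': ['corridor5A', 'exit'],
    'stairC': ['corridor5A', 'exit'],
    'exit': []
};
var onFire = { 'stairB': true };        //nodes reported by building sensors

function findEvacuationRoute(start, goal) {
    var queue = [[start]];              //queue of partial paths
    var visited = {};
    visited[start] = true;
    while (queue.length > 0) {
        var path = queue.shift();
        var node = path[path.length - 1];
        if (node === goal) return path; //first complete path found is the shortest
        var neighbors = floorPlan[node] || [];
        for (var i = 0; i < neighbors.length; i++) {
            var next = neighbors[i];
            if (visited[next] || onFire[next]) continue;
            visited[next] = true;
            queue.push(path.concat(next));
        }
    }
    return null;                        //no safe route found
}

//e.g. findEvacuationRoute('room501', 'exit') -> ['room501', 'corridor5A', 'stairC', 'exit']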

Scenario 2

Planning a rescue operation

  • E.g., which window on the 5th floor is accessible by a fire ladder, or where there are buildings with roofs large enough for a helicopter landing.

Google’s newly unveiled concept project, Google Glass, would be an ideal technology for firefighters.

Google Glass Preview

a. A benefit of using Google Glass is that, in an emergency, firefighters don’t want to look at yet another screen, whether a phone, a tablet, or anything else. Having a head-mounted display would help them focus on their jobs rather than on the technology. Moreover, it would also free their hands from holding a device so they could help people.

b. Google Glass uses natural language processing, which lets people talk to the technology in human language rather than machine language. Hence this kind of technology would be much friendlier to use and easier to adopt.

Tablet and Augmented reality Apps

a. Tablets with newer form factors and newer technologies could come in very handy for firefighters working in the field. Having all the data they need on a tablet could help them make decisions quickly, which helps save lives.

Vision of a building model using augmented reality

In conclusion, with the use of urban information modeling and augmented reality, we can not only reduce the response time from the fire station to the disaster site, but also provide effective and safe escape from the building and support the rescue operation.

With the use of CityGML and augmented reality, simulations of events can be used to train new personnel such as firefighters, police officers, etc.

Fire Simulation for Training

Urban Information Modeling Platform

This is a final project presentation for the Urban Information Modeling class at the Illinois Institute of Technology. Both authors were interested in an interactive platform to view and create 3D building models. Originally, it was just meant to be another way for citizens to take part in the modeling of their cities in the United States and eventually across the globe.

As the project evolved, increased user participation and the integration of 2D geographic information became key elements. The capability to break users into subgroups and to let them edit models, in addition to uploading originals, was significant. Attached is the presentation with a complete outline of how the platform would be organized and displayed.

Chicago Urban Migration - Wigdan Al-Guneid, Sakshi Aggarwal

OBJECTIVE

Obtain the relationship between the quality of a neighborhood and the migration of people to it. Through the analysis of attributes such as health care, education, transportation accessibility, and crime rate, we were able to gauge the popularity of a neighborhood and whether some attributes are more important than others in ranking the value of a property. Our analysis focuses on the relationship between housing values in six main neighborhoods from different parts of Chicago and the number of people who relocated from or to them.

The neighborhoods chosen are:

-China Town (District 31)

-Hyde Park (District 41)

-Lincoln Park (District 7)

-Logan Square (District 22)

-Garfield Ridge (District 56)

-Avalon Park

1- The data available to us should be translated into maps that illustrate the quality score for each of the chosen neighborhoods.

2- The analysis of the data should be derived from maps made using the CSV files and shapefiles downloaded from the City of Chicago website.

The software QGIS and pgAdmin will help in deriving maps with analysis of Chicago neighborhoods.

In order to understand the pattern of migrations within the neighborhoods of Chicago, the study had to include a method of scoring the quality of a neighborhood. A quality score is derived from a series of calculations and equations that are later translated into queries in pgAdmin and then into maps.

EQUATIONS

School Equation

School Score = (School Safety Score + Family Involvement Score + Parent Engagement Score + College Eligibility + College Enrollment) / 5

Hospital Equation

Hospital Score = (Accessibility of Hospitals + Number of Hospitals) / 2

Accessibility Equation

Accessibility Score = (School Safety Score + Family Involvement Score + Parent Engagement Score + College Eligibility + College Enrollment) / 5

Neighborhood Score Equation

Quality of Neighborhood = (Home Value Index + School Score + Hospital Score + Accessibility Score) / 4

TABLES & MAPS

After checking the accuracy of the data in the tables, which were created with pgAdmin, queries were made to connect the tables to QGIS. This step is important in order to match the data to the shapefile of the neighborhoods of the city of Chicago.

After making the maps, an analytic comparison was derived from the visual information in the maps and from the quality averages produced by the neighborhood score equations.

Summary:

The quality scores showed that the highest-quality neighborhood was Lincoln Park, followed by Logan Square, Garfield Ridge, Hyde Park, China Town, and Avalon Park.

This conclusion, however, matched the real-estate values of only Lincoln Park, Logan Square, and Avalon Park. Hyde Park and China Town had real-estate values that did not necessarily correlate with neighborhood quality.

This led us to conclude that other factors are at play, such as the ethnic backgrounds of the populations living there. China Town is heavily populated by Asian communities, and Hyde Park is likewise heavily populated by African Americans.

The study focused only on the amenities a neighborhood offers and whether they affect the decision to select a neighborhood.