The data used in City Browser comes from several sources provided by different US government agencies. I consolidated it into a single, manageable JSON file through a combination of web scraping and processing downloaded data files. The code that generates the JSON is written in Python and can also be found at this GitHub page.
|Climate||NOAA||Text||||NOAA publishes climate data for each station it tracks as plain-text files. The script first determined the station nearest to the city's coordinates as listed on Wikipedia. NOAA provides pre-calculated monthly normal (mean) highs and lows for each station over a 30-year period. To determine the mean maximum and minimum temperatures, the script read each station's daily weather records for as far back as they were available and averaged each month's maximums and minimums across all the available years.|
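The averaging step above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual code: the function name and the assumption that the daily records have already been parsed into `(date, tmax, tmin)` tuples are mine.

```python
from collections import defaultdict
from datetime import date

def monthly_means(daily):
    """Average daily highs and lows into per-month means across all years.

    `daily` is a list of (date, tmax, tmin) tuples, one per day, in any
    order -- a hypothetical stand-in for the parsed NOAA station records.
    """
    highs = defaultdict(list)
    lows = defaultdict(list)
    for d, tmax, tmin in daily:
        highs[d.month].append(tmax)
        lows[d.month].append(tmin)
    # For each month, average over every day of that month in every year.
    return {
        month: (sum(highs[month]) / len(highs[month]),
                sum(lows[month]) / len(lows[month]))
        for month in sorted(highs)
    }

# Toy data spanning two Januaries: the January mean high is (40 + 44) / 2.
records = [
    (date(2014, 1, 15), 40.0, 25.0),
    (date(2015, 1, 15), 44.0, 27.0),
    (date(2015, 7, 15), 90.0, 70.0),
]
print(monthly_means(records))
# {1: (42.0, 26.0), 7: (90.0, 70.0)}
```

Because each day is bucketed by month before averaging, months with more available years simply contribute more samples, which matches the "over all the available years" behavior described above.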
|Population||Wikipedia via U.S. Census Bureau||HTML||2015 Population Estimates||In the years between censuses, the U.S. Census Bureau estimates the current population of each city. The Census Bureau's own website was difficult to scrape, so I scraped the equivalent 2015 figures from each city's Wikipedia page instead.|
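The extraction step for a page like that might look like the sketch below. The HTML fragment, the row layout, and the number are all made up for illustration; the real script fetched the live Wikipedia page and located the equivalent infobox row.

```python
import re

# A trimmed, hypothetical stand-in for a Wikipedia infobox fragment.
INFOBOX_HTML = """
<tr><th>Population</th><td></td></tr>
<tr><th>Estimate (2015)</th><td>672,228</td></tr>
"""

def population_estimate(html):
    """Pull a 2015 population estimate out of an infobox-style row.

    Returns the figure as an int, or None if no matching row is found.
    """
    m = re.search(r"Estimate \(2015\)</th><td>([\d,]+)</td>", html)
    if m is None:
        return None
    # Strip the thousands separators before converting.
    return int(m.group(1).replace(",", ""))

print(population_estimate(INFOBOX_HTML))  # 672228
```

A regex is enough for a fragment this regular; a real scraper would more likely walk the parsed DOM with an HTML parser, since Wikipedia's markup varies from page to page.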
|Median rent||Office of Policy Development and Research||Spreadsheet||50th Percentile Rent Estimates||The Office of Policy Development and Research, an office of the U.S. Department of Housing and Urban Development, publishes yearly estimates of the median rent for areas of the United States. The script first searched the spreadsheet for the city's name and state; if it could not find the city itself, it fell back to the county that contains the city.|
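The city-then-county fallback described above reduces to a two-step lookup. This is a minimal sketch under my own assumptions: the spreadsheet is modeled as a dict keyed by `(area name, state)`, and the function name and sample figures are hypothetical.

```python
def median_rent(rows, city, state, county):
    """Look up a median rent, falling back to the county's row.

    `rows` is a stand-in for the parsed HUD spreadsheet: a dict mapping
    (area name, state) pairs to a rent figure.
    """
    key = (city, state)
    if key in rows:
        return rows[key]
    # City not listed: fall back to the county that contains it.
    return rows.get((county, state))

# Toy data: Springfield itself is missing, so the county row is used.
rows = {
    ("Shelbyville", "XX"): 900,
    ("Example County", "XX"): 850,
}
print(median_rent(rows, "Springfield", "XX", "Example County"))  # 850
```

Keying on both name and state avoids collisions between identically named places in different states, which is why the state rides along in every lookup.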