After more than half a decade at this, it has finally dawned on me that instead of downloading the Correlates of War state system membership table, or the Gleditsch and Ward refinement of it, every time I wonder what country “338” is, it might be easier to upload them to Google:
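Once the table is loaded, the lookup itself is trivial. A minimal Python sketch, using an inline stand-in for the real COW membership CSV (the sample rows are illustrative entries; check them against the actual table):

```python
import csv
import io

# Stand-in for the real Correlates of War state system membership file;
# the actual CSV has more columns (stateabb, start/end years, etc.).
sample = """ccode,statenme
2,United States of America
200,United Kingdom
338,Malta
710,China
"""

# Build a ccode -> country name lookup.
names = {int(row["ccode"]): row["statenme"]
         for row in csv.DictReader(io.StringIO(sample))}

print(names[338])  # → Malta
```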
Take a look at the chart below: what would you say about the overall trend in US defense spending? There’s a bump fairly early on for World War 2, but otherwise it seems to increase steadily over time. I’m actually surprised to see that we spend more, in constant US dollars, today than we did at the height of the Korean War, and in fact at any point in US history save World War 2.
The short version:
The longer version:
In 2013, my toolbox looks like this:
- Python for text processing and miscellaneous scripting;
- Python (NumPy/SciPy) for numerical computing;
- Python (Neurosynth, NiPy etc.) for neuroimaging data analysis;
- Python (NumPy/SciPy/pandas/statsmodels) for statistical analysis;
- Python (scikit-learn) for machine learning;
- Excursions into other languages have dropped markedly.
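As a rough illustration of what the NumPy/pandas layer of that stack feels like in practice, a minimal sketch (the data is simulated; nothing here comes from the posts above):

```python
import numpy as np
import pandas as pd

# Simulate a noisy linear relationship, wrap it in a DataFrame, and
# recover the slope and intercept with an ordinary least-squares fit.
rng = np.random.default_rng(42)
x = np.linspace(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

df = pd.DataFrame({"x": x, "y": y})
slope, intercept = np.polyfit(df["x"], df["y"], 1)
print(f"slope ~ {slope:.2f}, intercept ~ {intercept:.2f}")
```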
I can’t speak to the relative merits of Python over R beyond a general impression: R has the stronger statistics ecosystem but some quirks as a language (pdf), while Python is the more capable general-purpose language but thinner once you move past basic statistical tools. I did spend some time trying to learn Python during my last year of graduate school, but that was while I was still getting comfortable with R, so I didn’t put much effort into it. It seems like it’s time to head back in that direction.
I work as a Postdoctoral Fellow in the Ward Lab here at Duke University. The Lab currently consists of Mike Ward, me, and a group of very smart graduate students. There are a lot of exciting projects within the lab, like ICEWS and other work for the US government, but also a broader set of projects by our lab members. One of the things we wanted to do this semester is to publicize this work a little bit more, and to this end we’re taking a new blog live today: Predictive Heuristics.
Sometimes, for whatever reason, you want to plot something fast. Last week I had some coordinates associated with event data that I was hoping were all from Egypt. But the coordinates were for locations that are only indirectly associated with the events I had, so I wanted to do a quick plot to check. The ggmap package in R makes that pretty easy.
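For a cruder version of the same sanity check in Python, the coordinates can be tested against a rough bounding box for Egypt before plotting anything (the bounding-box values and the sample points are my own illustrative numbers, not from the original data):

```python
# Rough bounding box for Egypt: lon 24.7-36.9 E, lat 22.0-31.7 N.
EGYPT_BBOX = (24.7, 36.9, 22.0, 31.7)  # lon_min, lon_max, lat_min, lat_max

def in_egypt(lon, lat, bbox=EGYPT_BBOX):
    """Return True if (lon, lat) falls inside the bounding box."""
    lon_min, lon_max, lat_min, lat_max = bbox
    return lon_min <= lon <= lon_max and lat_min <= lat <= lat_max

# Hypothetical event coordinates: Cairo, Alexandria, and (oops) Paris.
points = [(31.24, 30.04), (29.92, 31.20), (2.35, 48.85)]
flags = [in_egypt(lon, lat) for lon, lat in points]
print(flags)  # → [True, True, False]
```

A bounding box is coarse (it includes slivers of neighboring countries), but for a quick "are these even in the right country?" check it is usually enough before reaching for a real map.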
Recently I’ve set up both a PostgreSQL and a MySQL server to host databases related to some of our projects in the Ward Lab. I should note that I have no idea what I’m doing: this is the first time I’ve dealt with databases at all, let alone gotten them working. It’s been a very humbling experience, but in the end we now have two different databases that can be accessed remotely from a laptop through R or other tools like Quantum GIS:
# Set up the connection to the database
library(rgdal)
dsn <- "PG: dbname='db' host='someIP' port='5432' user='me' password='guest'"
# Load Afghanistan boundary (source: GADM)
state <- readOGR(dsn, layer = "afg_adm0")
plot(state)
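The same connection details work from Python too. A hedged sketch using psycopg2: the credentials mirror the placeholders above, and the actual connect/query calls are commented out since they need the live server (the `afg_adm0` table name is carried over from the R snippet):

```python
def pg_dsn(dbname, host, port, user, password):
    """Build a libpq-style connection string. Note that rgdal/OGR wants
    an extra "PG:" prefix, but psycopg2 takes the bare string."""
    return (f"dbname='{dbname}' host='{host}' port='{port}' "
            f"user='{user}' password='{password}'")

dsn = pg_dsn("db", "someIP", "5432", "me", "guest")
print(dsn)

# With the server reachable, querying would look roughly like:
# import psycopg2
# conn = psycopg2.connect(dsn)
# cur = conn.cursor()
# cur.execute("SELECT COUNT(*) FROM afg_adm0")
# print(cur.fetchone())
```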