4/26/13 - Ross Lieb-Lappen, Salty Snow and Radical Reactions: The Activation of Bromide in the Sea Ice Zone
Polar tropospheric ozone depletion events (ODEs) are an early-springtime phenomenon strongly correlated with increased concentrations of reactive bromine gases (BrO and Br). However, the mechanism by which Br enters the atmosphere is not well understood. This research focuses on the transport of Br, which is hypothesized to enter the atmosphere via blowing snow over first-year sea ice. Using ion chromatography, X-ray micro-computed tomography, and X-ray micro-fluorescence, we aim to identify the microstructural and stratigraphic location of Br and other salts in the snow and ice.
4/19/13 - Owen Myers, The Electric Curtain
The Electric Curtain (EC) has been proposed for particle manipulation and control in various applications. The EC studied here consists of a series of parallel electrodes embedded in a thin dielectric surface, driven by electric potentials of a specific frequency with a prescribed phase difference between neighboring electrodes. In this paper, the nonlinear dynamics of a particle driven by the electric field generated by a two-phase EC are discussed. We focus on the dependence of the particle dynamics on the two parameters A and β, the nondimensionalized field amplitude and damping, respectively. The possible dynamics are presented by showing attractors and their basins over variations of the parameters A and β, using time maps. We present our results separately for two different regimes of particle motion in an EC field: (1) one-dimensional particle motion, and (2) particles levitated above the surface and bouncing along it.
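As an illustration of this kind of system, here is a minimal sketch that integrates a damped particle in a generic nondimensionalized standing-wave field, x'' = A cos(x) cos(t) − β x'. Both the field form and the integrator are assumptions for illustration, not the paper's exact equations:

```python
import math

def simulate_ec_particle(A, beta, x0=0.1, v0=0.0, dt=1e-3, steps=20000):
    """Semi-implicit Euler integration of x'' = A*cos(x)*cos(t) - beta*x',
    a generic stand-in for 1D particle motion in a two-phase EC field."""
    x, v, t = x0, v0, 0.0
    traj = []
    for _ in range(steps):
        a = A * math.cos(x) * math.cos(t) - beta * v  # field force minus damping
        v += a * dt          # update velocity first (semi-implicit Euler)
        x += v * dt          # then position, using the new velocity
        t += dt
        traj.append(x)
    return traj

traj = simulate_ec_particle(A=1.0, beta=0.5)
```

Sweeping A and β and recording the long-time behavior of such trajectories is one way to build the kind of attractor-and-basin picture the talk describes.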
3/22/13 - Suma Desu, Finding Stories
Here we propose an automated method of finding stories. Using a partitioner that finds meaningful phrases, together with measures of conditional entropy and ambient valence, we hope to track stories and watch them spread in a data-driven manner. We are just now starting to build the infrastructure to support this kind of work. More than a talk, this will be a brainstorming session, and we would love any input!
2/15/13 - Morgan Frank, A Dimensional Analysis of the Lorenz '96 Model
The Lorenz '96 Model is a popular model for testing prediction techniques for dynamical systems, like the atmosphere, due to its exponential error growth. The model comprises two coupled systems, each representing atmospheric measurements about a given latitude: the first system consists of some number (I) of slow-time observable oscillators, and the second consists of some number (J) of fast-time unobservable oscillators per slow oscillator. By examining different choices for I and J, we can explore the dynamics of the Lorenz '96 Model in a novel way. We find that despite the model's popularity as a chaotic system, there exist dimensions in which the system dynamics exhibit highly predictable behavior in the form of standing waves.
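The standard two-scale Lorenz '96 equations, with I slow variables coupled to J fast variables each, can be sketched as follows (the coupling constants h, b, c and forcing F below are the commonly used defaults, not values stated in the abstract):

```python
import numpy as np

def lorenz96_two_scale(X, Y, F=10.0, h=1.0, b=10.0, c=10.0):
    """Tendencies of the two-scale Lorenz '96 model.
    X: slow variables, shape (I,); Y: fast variables, shape (I, J)."""
    I, J = Y.shape
    dX = np.empty(I)
    for i in range(I):
        # advection - damping + forcing - coupling to this sector's fast modes
        dX[i] = (X[(i + 1) % I] - X[i - 2]) * X[i - 1] - X[i] + F \
                - (h * c / b) * Y[i].sum()
    Yf = Y.ravel()          # fast variables on one cyclic ring of length I*J
    N = Yf.size
    dYf = np.empty(N)
    for k in range(N):
        dYf[k] = -c * b * Yf[(k + 1) % N] * (Yf[(k + 2) % N] - Yf[k - 1]) \
                 - c * Yf[k] + (h * c / b) * X[k // J]
    return dX, dYf.reshape(I, J)

# crude forward-Euler integration for a few steps, just to exercise the model
rng = np.random.default_rng(0)
I, J = 8, 4
X, Y = rng.standard_normal(I), 0.1 * rng.standard_normal((I, J))
dt = 0.001
for _ in range(100):
    dX, dY = lorenz96_two_scale(X, Y)
    X, Y = X + dt * dX, Y + dt * dY
```

Varying I and J here changes the dimension of the system, which is exactly the knob the talk turns to look for regimes of predictable, standing-wave behavior.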
11/16/12 - Eric Clark, Measuring the Information Content of Words in a Corpus
Salvaging the underlying message from a text in a data-driven way is a challenging problem in Computer Science and Computational Linguistics. When considering the amount of information carried by a particular word in some corpus, one approach is to consider all of the specific contexts in which the word can appear. An N-gram distribution is a frequency distribution of length-N phrases across some corpus. Using both the Google Web Crawl and Google Books N-gram distributions, the amount of information required to encode a word can be statistically approximated. The words analyzed in this project were originally compiled to measure the valence of the 10,000 most frequently used words in English. Using these word sets and N-gram corpora, the relationship between a word's valence and the amount of information it carries is explored.
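As a toy illustration of "information carried by a word's contexts," one can compute the conditional entropy of the next word from bigram counts; the tiny corpus below is invented for illustration:

```python
import math
from collections import Counter, defaultdict

def conditional_entropy(bigrams):
    """H(next word | word), estimated from raw bigram counts: a rough proxy
    for how much information a word's following contexts carry."""
    follow = defaultdict(Counter)
    for (w1, w2), n in bigrams.items():
        follow[w1][w2] += n
    H = {}
    for w, counts in follow.items():
        total = sum(counts.values())
        H[w] = -sum((n / total) * math.log2(n / total)
                    for n in counts.values())
    return H

# invented counts: "the" is followed by two equally likely words,
# "a" is always followed by the same word
bigrams = {("the", "cat"): 2, ("the", "dog"): 2, ("a", "cat"): 4}
H = conditional_entropy(bigrams)  # H["the"] = 1 bit, H["a"] = 0 bits
```

On a real N-gram corpus, the same calculation applied over all contexts a word appears in gives one statistical approximation of the information needed to encode it.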
10/26/12 - Mike Foley, Evolving trading strategies in a toy agent based market model
Finance has become a major staple of our economy, and now accounts for over 10% of the United States GDP. Hundreds of billions of dollars are spent each year on building models, developing trading algorithms, and slicing microseconds off of trade execution time, all in the hope of having the next winning strategy. In an agent-based model implementing a genetic algorithm, we explore relationships between competing trading strategies, and answer such questions as: What types of strategies have lasting power? Which strategies are sure losers? How does the market environment affect the space of successful strategies? What, if any, is really the difference between being lucky and being good?
10/12/12 - Andy Reagan, Understanding the natural structure of human logic
The structure with which the human brain stores information has been shown to correlate with intelligence, and this structure forms in childhood. Mathematics is a complex system that stores information across many results, which together can have unexpected consequences. Here, we look at a network of logical implication in mathematical analysis as an analog to the structure with which humans create structured information. We see first that the most important, and most complex, statements of logic are supported by smaller ideas. Further investigations into the structure of this network are considered.
9/21/12 - Josh Auerbach, Automated evolution of interesting images
Recent work (Secretan et al., 2011) has demonstrated that it is possible to evolve novel and interesting images through interactive evolution. However, interactive evolution is a slow process that requires the active involvement of human users. It is desirable to evolve interesting images without requiring human users to perform selection. In this work I explore alternate methods of evolving interesting images that are completely automated, yet in some cases still indirectly informed by what humans find interesting.
9/14/12 - Bill Gottesman, A Power Size Law Model for Philanthropic Behavior
Philanthropic gifting appears to follow power-law behavior, as measured both from the size of gifts made by donors and from the size of gifts received by individual charities. The power-law exponents (gammas) differ widely among individual charities, but are relatively constant within specific classes of charity (for example, education or health). From analysis of data from multiple charities, we can construct a model of how a donor's income affects their preferences for giving to different classes of charities. We can also develop charity-specific guidelines for fundraising, and some feedback analysis of the success of campaigns as they are underway.
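For a gift-size distribution, the power-law exponent can be estimated with the standard continuous maximum-likelihood estimator (the Hill/Clauset form); the "gifts" below are synthetic, since the charity data are not reproduced here:

```python
import math
import random

def power_law_alpha(xs, xmin):
    """Continuous MLE for the exponent of p(x) ~ x^(-alpha), x >= xmin:
    alpha = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# synthetic "gifts" drawn from a power law by inverse-transform sampling
random.seed(42)
alpha_true, xmin = 2.5, 1.0
gifts = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(20000)]
alpha_hat = power_law_alpha(gifts, xmin)  # should land near 2.5
```

Fitting this exponent separately per charity (or per charity class) is the kind of calculation behind the observation that gammas are stable within classes but differ across them.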
4/27/12 - Nick Cheney, Investigations into Morphological Scaffolding
Evolutionary robotics uses ideas from evolution to automatically design robots: poorly-performing robots are discarded; modified copies of the remaining robots are made; and some of these new robots perform better than their 'parents', incrementally improving the population. Usually, though, only the brains of the robots are optimized. Previous studies suggest that morphological change and the evolution of development are two important processes that improve the evolution of robust behaviors, in robots as well as in animals. However, there have been few previous attempts to investigate this phenomenon. Here I create a new setup in which to explore it, in which a driving vehicle changes its body in order to gain the physiology for walking. I then expand upon a previous study in which an evolving, simulated robot begins as a slithering anguilliform and develops the physiology for walking during its lifetime. With regard to this study, I automate the highly sensitive timeline of this development, thus removing the assumptions and parameters associated with this previously hand-designed feature. I then propose a new analytical measure of the effectiveness of this entire process of morphological change, based on the results of this automation procedure.
4/20/12 - Catherine Bliss, Twitter reciprocal reply networks exhibit assortativity with respect to happiness
Several recent studies have examined the topology of a network inferred from Twitter users and their “followers.” We argue against the use of networks inferred from following behavior and propose an alternative construction based on reciprocal replies. In this study, we take a more dynamic view of the network, accounting for the “unfriending problem” and detecting temporal network changes at the level of days, weeks and months. Furthermore, we construct a null model to test the effect of network topology on the assortativity of happiness for 40 million message pairs posted between September 2008 and February 2009. We find users’ average happiness scores to be positively correlated with nearest neighbors’ average happiness scores one, two and three links away, and that these correlations decrease with increasing path length. This work provides evidence of a social sub-network structure within Twitter and raises several methodological points of interest with regard to social network constructions.
4/6/12 - Paul Beliveau, Interactive robot construction and controller evolution in simulation
A robot's morphology affects not only its capabilities, but also its evolvability. Here we introduce an evolutionary robotics platform that will enable us to investigate the ability of non-roboticists to collectively explore the design space of evolvable robot body plans. Users create robots in a simulator using an interactive interface, then let evolution find a controller that enables efficient locomotion. The user can change the morphology of the robot and test the new design iteratively, altering the design based on the assessment of the robot's performance after a period of evolution of the controller. We investigate whether there is a correlation between the methods users choose to build robots and the evolvability of the robots.
3/30/12 - Eduardo Cotilla-Sanchez, Big data and power grids, learning from the former in order to sustain the future of the latter
The steady increase of renewable sources of energy being incorporated into electric power infrastructure, together with the market deregulation of recent years, puts pressure on “The Grid” to consistently function near high-risk operation points. The fact that the frequency of large blackouts is not decreasing constitutes evidence for this criticality. Therefore, even though one of the objectives of engineers and policy makers is to achieve sustainability and mitigate climate change, we may be compromising another important factor: the security of the system. In this talk, we will discuss methods to assess the vulnerability of electric power infrastructure. In particular, we will evaluate the risk of high-impact, low-probability events. In order to attain this goal in a manner that is efficient and timely with respect to system operator response, we will present approaches that leverage information from big data sources (such as synchrophasor time series) while exploiting the parallel nature of modern computer architectures.
3/16/12 - Narine Manukyan,
We introduce a new method for exploratory analysis of large data sets with time-varying features, where the aim is to automatically discover novel relationships between features (over some time period) that are predictive of any of a number of time-varying outcomes (over some other time period). Using a genetic algorithm, we co-evolve (i) a subset of predictive features, (ii) which attribute will be predicted (iii) the time period over which to assess the predictive features, and (iv) the time period over which to assess the predicted attribute. After validating the method on 15 synthetic test problems, we used the approach for exploratory analysis of a large healthcare network data set. We discovered a strong association, with 100% sensitivity, between hospital participation in multi-institutional quality improvement collaboratives during or before 2002, and changes in the risk-adjusted rates of mortality and morbidity observed after a 1-2 year lag. The results provide indirect evidence that these quality improvement collaboratives may have had the desired effect of improving health care practices at participating hospitals. The proposed approach is a potentially powerful and general tool for exploratory analysis of a wide range of time-series data sets.
3/16/12 - Narine Manukyan,
A self-organizing map (SOM) is a self-organized projection of high dimensional data onto a typically two dimensional (2D) feature map, wherein vector similarity is implicitly translated into topological closeness in the 2D projection. However, when there are more neurons than input patterns, it can be challenging to interpret the results, due to diffuse cluster boundaries and limitations of current methods for displaying interneuron distances. In this Brief, we introduce a new Cluster Reinforcement (CR) phase for sparsely-matched SOMs. The CR phase amplifies within-cluster similarity in an unsupervised, data-driven manner. Discontinuities in the resulting map correspond to between-cluster distances and are stored in a boundary (B) matrix. We describe a new hierarchical visualization of cluster boundaries displayed directly on feature maps, which requires no further clustering beyond what was implicitly accomplished during self-organization in SOM training. We use a synthetic benchmark problem and previously published microbial community profile data to demonstrate the benefits of the proposed methods.
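For context, a minimal version of the underlying SOM training (just the standard self-organization phase, not the proposed CR phase or B matrix) might look like:

```python
import numpy as np

def train_som(data, rows, cols, epochs=20, lr0=0.5, seed=0):
    """Minimal SOM: each input pulls its best-matching unit (BMU) and the
    BMU's 2D grid neighbors toward itself; the learning rate and
    neighborhood radius shrink over epochs."""
    rng = np.random.default_rng(seed)
    weights = rng.random((rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    sigma0 = max(rows, cols) / 2.0
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)
        for x in data:
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            g2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-g2 / (2.0 * sigma ** 2))   # Gaussian neighborhood
            weights += lr * h[..., None] * (x - weights)
    return weights

# two well-separated clusters should map to different regions of the grid
data = np.vstack([np.zeros((10, 3)), np.ones((10, 3))])
weights = train_som(data, 4, 4)
```

The sparsely-matched case in the abstract is exactly this situation with more units (here 16) than inputs: many units sit between clusters, which is what the CR phase is designed to sharpen.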
2/17/12 - Josh Auerbach, On the relationship between environmental and morphological complexity in evolved robots
The principles of embodied cognition dictate that intelligent behavior must arise out of the coupled dynamics of an agent's brain, body, and environment. While the relationship between controllers and morphologies (brains and bodies) has been investigated, little is known about the interplay between morphological complexity and the complexity of a given task environment. It is hypothesized that the morphological complexity of a robot should increase commensurately with the complexity of its task environment. Here this hypothesis is tested by evolving robot morphologies in a simple environment and in more complex environments. More complex robots tend to evolve in the more complex environments lending support to this hypothesis. This suggests that gradually increasing the complexity of task environments may provide a principled approach to evolving more complex robots.
2/10/12 - Kameron Harris, Chaotic social contagion, from zombies to hipsters
Simple, binary state (on/off) dynamical processes on networks are a tractable modeling framework relevant to many spreading processes in social, biological, or physical systems. Example applications are to social contagion and marketing, percolation in materials, and infectious diseases (including zombies). These models are closely related to toy models of magnetism which are the canonical example for phase transitions, "emergence" in the language of complex systems. When nodes are allowed to deactivate after being active, the dynamics can become much crazier. Here I will present preliminary results on a model that exhibits macroscopic chaos, where the bifurcation parameters are related to network structure and node update synchronicity. Individual node response functions can be interpreted as a way of modeling "hipster"-ish adoption of fads.
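A minimal sketch of such a binary-state process with deactivation: at each synchronous step, every node redraws its state from a response function of the currently active fraction of its neighbors. The non-monotonic "hipster" response below is a hypothetical illustration, not the talk's exact model:

```python
import random

def step(active, neighbors, response):
    """One synchronous update: each node's next state is drawn from its
    response function of the active fraction of its neighbors, so active
    nodes can also deactivate."""
    new = set()
    for node, nbrs in neighbors.items():
        frac = sum(n in active for n in nbrs) / len(nbrs) if nbrs else 0.0
        if random.random() < response(frac):
            new.add(node)
    return new

random.seed(1)
N = 200
# a random sparse network: each node watches 5 others
neighbors = {i: random.sample([j for j in range(N) if j != i], 5)
             for i in range(N)}
# hypothetical non-monotonic response: adopt when some, but not too many,
# neighbors are active
response = lambda f: 0.9 if 0.2 <= f <= 0.6 else 0.05
active = set(random.sample(range(N), 20))
fractions = []
for _ in range(50):
    active = step(active, neighbors, response)
    fractions.append(len(active) / N)
```

Tracking the active fraction over time for different networks and response functions is the kind of macroscopic time series in which the talk's chaotic dynamics show up.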
2/3/12 - Shane Celis, Terminating atrial fibrillation: a comparison of evolved and human designed ablation sets
A cellular automaton model of heart tissue is described which captures salient features present in Atrial Fibrillation (AF), a common heart disorder. An evolutionary algorithm is used to search the space of ablation sets that will terminate AF. These evolved solutions allow us to pose the following questions: Given three proposed phenotype encodings, which performs best? We find that the performance differences of the encodings are not statistically significant. How do they compare with a human-designed ablation set from a domain expert? With respect to the fitness function, the evolved solutions significantly outperform the human design. However, an extrapolation of the original human design that is less conservative performs as well as the evolved solutions on two important measures. Finally, we test this hypothesis: if a diseased patch of tissue is prone to initiate or aggravate AF, ablations are more effective there. For a diseased patch close to a boundary there was no significant difference in the number of ablations. For a diseased patch away from a boundary, there were significantly fewer ablations within the patch compared to a control. Neither finding supports our initial hypothesis. We discuss the significance and limitations of the results found and propose improvements for future work.
1/27/12 - Nick Allgaier, Empirical correction of a toy climate model
Improving the accuracy of forecast models for physical systems such as the atmosphere is a crucial ongoing effort. In this talk I'll discuss an empirical model correction procedure that effectively forces a forecast model to learn from its past mistakes. The application of the technique to a "toy" climate model will be detailed, whereby its effectiveness can be explored in a simple, but still realistic scenario, in which the model is structurally different (in dynamics, dimension, and parameterization) from the target system. Results suggest that the correction procedure is more effective for reducing error and prolonging forecast usefulness than tuning of model parameters. However, the cost of this increase in average forecast accuracy is the creation of substantial qualitative differences between the dynamics of the corrected model and the true system. A method to mitigate dynamical ramifications and further increase forecast accuracy will be presented.