Friday, December 28, 2012

"Double Entry" by Jane Gleeson-White


Anyone interested in numbers and the history of math will enjoy "Double Entry: How the Merchants of Venice Created Modern Finance" by Jane Gleeson-White. It's a new history of the development of mathematics and accounting from the Renaissance forward, with a particular emphasis on the work of the mathematician Luca Pacioli. Gleeson-White is Australian, and brings a refreshing perspective to the convoluted politics of her story. The history of double-entry bookkeeping is necessarily entwined with the conversion from Roman to Arabic numerals and the development of the printing press. Pacioli documented and standardized bookkeeping, and gave the world a versatile tool. His timing was fortunate: the newly invented printing press meant his description could be published and distributed throughout Europe.

The first six chapters take us through the 19th century, when double-entry bookkeeping was adapted to take account of the developments of the industrial revolution (expenses of man-made components like rail ties are conceptually different from a good that is traded away, like a gallon of wine). This part of the book is lucid and interesting, though it might have been enhanced by illustrations that match the glory of the cover illustration.

Gleeson-White loses her way a bit in the final four chapters, forgetting that she is talking about a tool - instead she reifies double-entry bookkeeping into the cause of many social ills. Her point that we do not account for the environmental costs of many of our actions is a good one, but that is not because of the tool. On accounting for environmental costs, today's New York Times carries an interesting article on Ireland's three-year history of carbon taxes. One short quote:
“We are not saints like those Scandinavians — we were lapping up fossil fuels, buying bigger cars and homes, very American,” said Eamon Ryan, who was Ireland’s energy minister from 2007 to 2011. “We just set up a price signal that raised significant revenue and changed behavior. Now, we’re smashing through the environmental targets we set for ourselves.”
By contrast, carbon taxes are viewed as politically toxic in the United States. Republican leaders in Congress have pledged to block any proposal for such a tax, and President Obama has not advocated one, although the idea has drawn support from economists of varying ideologies.
 Cover image via Amazon.com

Monday, December 24, 2012

Adam Davidson on statistics and narrative

If you haven't yet read Adam Davidson's article "God Save the British Economy" in yesterday's New York Times Magazine, click on the link and read it. It's a very clear explanation of the British government's economic austerity policy and related criticism. From this blog's perspective, it's also an explicit and important description of the use of statistics in forming narrative - the narrative that we use to make sense of the world.

Here's one example from the article:
Economics often appears to be an exercise in number-crunching, but it actually resembles storytelling more than mathematics. Before the members of the Monetary Policy Committee gather for their monthly meeting, they sit through a presentation from the Bank of England’s economic staff. The staff members take the most recent economic data — G.D.P. growth, the unemployment rate and more subtle details gathered from interviews with businesspeople throughout the country — and try to fashion it into a narrative. Does a sudden spike in new factory orders represent a fundamental shift, or is it just a preholiday blip? Do anecdotal reports of rising food prices herald a period of inflation, or is it the result of a cold snap? Which story feels truer?
I've often talked about the importance of thinking about the statistics we use, understanding the context and thinking about the cognitive choices we are making when we develop and interpret the statistics. In this story Davidson dissects a set of choices. Read it and think about it - and tell me what you think in the comments.

Friday, December 21, 2012

Year in review, from Google


Google has released this cool video, showing how we searched over the course of the year. And there's more data here: searches, images, TV shows, events and more. Have fun.

I'll be posting intermittently and infrequently between now and January 2. Happy New Year, and thanks for reading.

Thursday, December 20, 2012

Identifying a poor work environment, and some things you can do about it

Here's a link to a useful article "Fight the Nine Symptoms of Corporate Decline" by Rosabeth Moss Kanter on the Harvard Business Review HBR Blog network (free when you register). Here are the warning signs Kanter identifies:
  • Communication decreases
  • Criticism and blame increase
  • Respect decreases
  • Isolation increases
  • Focus turns inward
  • Rifts widen and inequities grow
  • Aspirations diminish
  • Initiative decreases
  • Negativity spreads
Though the examples may be from private business, the symptoms can occur in not-for-profits too. In fact, I've worked in places with some or all of these symptoms, and she's right: they are a warning of troubled times. This is a good checklist for taking the temperature of your office. But seeing a warning sign doesn't necessarily mean you're on an extended downward slide. Kanter continues with some ways to shift a culture toward more successful habits:

  • Keep communication open and information flowing. Foster widespread problem-solving dialogue. Face facts openly and honestly.
  • Emphasize personal responsibility. Refuse to listen to attacks on others and ask each person to take responsibility for his or her part of a problem.
  • Model respect for talent and achievements at every level. Offer frequent public thanks. Praise those who meet high standards while helping poor performers improve (or weeding them out if they don't).
  • Convene conversations across groups. Involve diverse cross-cutting teams in problem-solving.
  • Stress common purpose. Communicate inspiring goals larger than any individual or group. Find a grand challenge to unite people.
  • Work on reducing inequities and status differences. Require the privileged to mentor and help others. Spread extra resources to many groups, and encourage joint projects or shared service. Provide opportunities for learning and growth.
  • Raise aspirations. Use small wins to show the potential for bigger successes. Encourage realistic stretch goals and offer people the help to reach them.
  • Reward initiative. Provide time or small grants to work on new ideas. Make brainstorming a habit.
  • Reinforce the positive by saying and demonstrating that change is possible. Ignore the voices of negativity.
Common sense, yes. But hard to do unless you are really thinking about your environment.

Wednesday, December 19, 2012

Warm November weather


Weird December weather - not to mention October and November - when was the last time you saw a white Christmas? Climate Central is reporting that November 2012 was the 333rd straight month when average temperatures exceeded the 20th century average. That's nearly 28 years. And since 15 of those years were in the 20th century and included in the average, well, that's bad news.

Climate Central also says:
Much of the world saw warmer-than-average temperatures during November. Warmer-than-average weather affected Australia, the Central and Western U.S., northern Africa, far eastern Russia, and central Asia. The small European nation of Croatia was particularly mild during November, with temperatures ranging from 4.3°F to 7.9°F above average during the month.

Tuesday, December 18, 2012

Neuroscience research

Yesterday Columbia University announced a very large gift establishing the endowment for the Zuckerman Mind Brain Behavior Institute at Columbia. In addition to its timeliness, it's an exciting effort to bring together work in neuroscience, decision-making, and imaging with the humanities. The institute will build on the work of Nobel Laureates Eric Kandel and Richard Axel, among others. If you haven't read Kandel's books "In Search of Memory" and "The Age of Insight" (my review of the latter in the Brooklyn Bugle is here) I highly recommend them. And here's an earlier post of mine with links to some interesting articles about the science of decision-making.

Columbia held a forum on interdisciplinary neuroscience in conjunction with the announcement. From the discussion it's clear that the institute is still taking shape. The panelists mostly described the contributions the various disciplines will bring, and they expressed some hopes for useful research in the first decade.

Richard Axel spoke more philosophically (and I am paraphrasing). Axel said that the brain is the most complex structure in the universe, and we don't understand it. We have learned that an individual alone cannot understand the brain - and plenty of individuals in disciplines ranging from philosophy to biology have tried. Moreover, the mind doesn't lend itself to verbal description. The institute is being formed to find new ways to address the problem.

The mind is particularly elusive, Axel went on, because neurons (nerve cells) are not like liver cells or heart cells, where genes control the behavior. Neurons themselves do not control behavior either; that is the job of neurosystems. Neurosystems are large and complex, with trillions of connections. The brain itself abstracts each system - and translates higher order notions into the firing of neurons. It's like abstract art, he said. And the task is to understand the meaning of this abstraction.

It will be interesting to see, in 30 years, what the institute looks like. Meanwhile, you can read Columbia's press release and watch a short video about the institute here.


Monday, December 17, 2012

Some gun data


The Guardian has posted an interactive US Gun Crime Map on its data blog. The screenshot above shows the percent change in firearms murders per 100,000 population between 2010 and 2011. (Note that no data are available for Alabama and Florida.)


You can also see firearms murders as a percent of all murders, firearms murders per 100,000 population in 2010 and 2011, and firearms assaults and robberies per 100,000 population. It's interesting.


Saturday, December 15, 2012

One number

and it is too large: 27.

Please go to whitehouse.gov and sign this petition. And send a letter to the President: today is the right day to start the conversation about gun control, nationwide.

Update, December 17: For a good look at the discussion and where it can start, see this James Fallows blog post and related links. Further update: And if you need some definitions, Slate provides them here.

Friday, December 14, 2012

Global mortality rates and causes, visually

There have been a lot of news reports about the Global Burden of Disease Study published yesterday in The Lancet. The Guardian's data blog has an interactive graphic showing cause of death visually. Here is a screenshot showing the 2010 causes (by percentage) of death among women by age group.

And a screenshot of the same data by region:

You can compare 1990 to 2010, look at rates for men, women or both, look at rate, number or percentage. And there are tables comparing 1990 with 2010. Fascinating, and very well done.

Thursday, December 13, 2012

Understanding more about Bayesian analysis



Since I finished reading Nate Silver's book "The Signal and the Noise" (you can read my review of it here) I've been trying to find a way to describe the difference between Bayesian and standard statistics.

As I understand it, standard, or frequentist, statistics - the kind we were taught in school - asks the question: given a set of data, how frequently will a particular phenomenon occur? According to Silver, this way of looking at a question means that we are thinking hard about the accuracy of our measurement (but assuming that we are measuring what we want to measure).

Bayesian statistics, on the other hand, asks the question: given a certain outcome or set of data, what is the most likely cause (or causal chain) for that outcome? Again according to Silver, Bayesian statistics allows us to think about how certain we are that we know something.

Here's a relatively simple explanation of the math. 
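In the same spirit, here's a toy worked example (the numbers are my own, invented purely for illustration, not taken from Silver's book): even a quite accurate test for a rare condition leaves a surprisingly low probability that a positive result is real.

```python
# Toy Bayes' theorem calculation: how likely is it that you have a
# condition, given a positive test? (Numbers invented for illustration.)

prior = 0.01           # P(condition): 1% of the population has it
sensitivity = 0.95     # P(positive test | condition)
false_positive = 0.05  # P(positive test | no condition)

# Total probability of a positive test, via the law of total probability
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' theorem: P(condition | positive test)
posterior = sensitivity * prior / p_positive

print(round(posterior, 3))  # 0.161 -- far lower than intuition suggests
```

The prior is doing the work here: because the condition is rare, most positive tests come from the large healthy group, which is exactly the kind of "how certain are we that we know something" reasoning described above.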

 
And here is a more complex one:



The power comes from the ability to vary the different scenarios. Using a Monte Carlo simulation, the analyst builds a model but substitutes a range of values for any factor that is uncertain. That's what Nate Silver does in the fivethirtyeight.com analysis for his blog, as you can see when you read his methodology. (You can read Jim Manzi's book 'Uncontrolled' for a look at the same thing using big data.) Acknowledging and accounting for uncertainty means that you get better results in the long run - as in the submarine search example in the first video above.
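A minimal sketch of that idea, with invented numbers rather than anything from Silver's actual (far more elaborate) model: instead of treating a candidate's polling average as exact, draw the true vote share from a distribution and count how the simulated elections come out.

```python
import random

random.seed(42)  # make the run repeatable

def simulate_elections(mean_share=0.51, uncertainty=0.03, runs=10_000):
    """Monte Carlo sketch: substitute a range of randomly drawn values
    for the uncertain factor (here, a candidate's true vote share) and
    count how often each outcome occurs."""
    wins = 0
    for _ in range(runs):
        share = random.gauss(mean_share, uncertainty)  # one simulated scenario
        if share > 0.5:
            wins += 1
    return wins / runs

# A 51% polling average with 3 points of uncertainty is far from a sure thing:
print(simulate_elections())  # roughly 0.63, not 1.0
```

The point is the shape of the answer: a probability rather than a yes-or-no call, which is exactly what acknowledging uncertainty buys you.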

Why weren't we taught it? Two reasons. First, running these simulations (Silver talks about running 10,000 a day, and that was in 2008) takes a lot of computing power, power that has only recently become available. Second, because Bayesian analysis starts with what we think we know, with a greater or lesser degree of certainty, some philosophers of science have argued, for various reasons, that Bayesian analysis fails to take account of the problem of induction: i.e., that the only true knowledge comes from deduction. (I admit I am way oversimplifying here.) (Silver has a very interesting chapter on his discussions with Donald Rumsfeld about unknown unknowns.) This view is now being rebutted. If you are interested, there's a good and reasonably accessible paper, "Philosophy and the practice of Bayesian statistics," written by Andrew Gelman and Cosma Shalizi, available here.


Wednesday, December 12, 2012

Arctic warming is accelerating




Last week NOAA released its Arctic Report Card: Update for 2012. The news is generally bad.
In 2012:
  • 97% of the Greenland ice sheet's surface melted. In four days.
  • the Arctic sea ice pack melted at alarming rates this summer. See also here.
  • the land and ocean surfaces are darker than normal - which means they are absorbing more sunlight and warming faster than normal.
As the journal Nature quotes one of the report's editors on its website:
The darkening of the surface creates a positive feedback that explains why the Arctic is warming twice as quickly as lower latitudes . . . This is what we call the Arctic amplification of global warming, a phenomenon that was predicted 30 years ago, which we’re now seeing happening in a significant way.
And arctic fox and lemming (no symbolism there) populations are dropping.

Tuesday, December 11, 2012

Tree decorations by the numbers

Just in time for Christmas, a group of math students at the University of Sheffield, in the UK, has calculated the number of decorations, the height of the star, and the length of tinsel and lights you need for the perfect Christmas tree. And they've posted a plug-in formula so you can avoid the math.

I'm not sure what "perfect" means in this context (the screenshot is a still from the Debenham's TV ad, which you can watch here). The formula is based on the height of the tree and defaults to 140 centimeters, which is about 4.5 feet.

For a 4.5 foot tree, they say, you need 29 baubles, a 5.5 inch star, and about 14.5 feet of lights.

For a bigger tree, say 8 feet (my converter says that's 244 cm) you'd need 50 baubles, a 9.5 inch star, and about 25 feet of lights.
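For those who'd rather see the math than use the plug-in page, here's a sketch of the formulas as I read them from the coverage (treat the exact coefficients as my transcription, not the official source); they reproduce the numbers above:

```python
import math

def tree_decorations(height_cm):
    """The Sheffield students' 'treegonometry' formulas, as I read them
    from press coverage -- coefficients are my transcription."""
    return {
        "baubles": round(math.sqrt(17) / 20 * height_cm),
        "tinsel_cm": round(13 * math.pi / 8 * height_cm),
        "lights_cm": round(math.pi * height_cm),
        "star_cm": round(height_cm / 10, 1),
    }

print(tree_decorations(140))  # the 140 cm default: 29 baubles, 14 cm star
print(tree_decorations(244))  # an 8-foot tree: 50 baubles
```

Everything scales linearly with height, which is why the ratios below fall out so neatly.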

Trying this out makes me see some ratios - the star has been set at about 10% of the height of the tree. What happens when you try out the formula? Any thoughts on why height, not surface area or volume, is the basis? Have you ever not put all your ornaments on the tree? Have you even counted them? Send pictures!

Monday, December 10, 2012

NASA's Grail project maps the moon's gravity field



Update, December 14: Here's a link to the NY Times story about the crash of the two satellites into the dark side of the moon, projected for Monday.

Ebb and Flow, two satellites that have been orbiting the moon collecting data about the moon's gravity field, have sent back enough data for NASA to release this video of the gravity field map.

As NASA describes it, the map shows:
an abundance of features never before seen in detail, such as tectonic structures, volcanic landforms, basin rings, crater central peaks and numerous simple, bowl-shaped craters. Data also show the moon's gravity field is unlike that of any terrestrial planet in our solar system.
Why is this important? The moon's surface preserves the record of impacts from other bodies - a record that on Earth is overgrown, underwater, or fractured. There will be more data and more reports, until the two satellites crash later this month. Read more here.

You can see a map of the earth's gravitational field (it's not round) here.

Friday, December 7, 2012

Useful tip for the weekend

This one is from Jesus Diaz of Gizmodo:
There's a Google Mail feature you have to use. Seriously. You must. Because copying an entire chain of messages after your reply doesn't make any sense when people can scroll down to see all the messages, chained one after the other. What makes sense is to only provide the snippet that you are actually replying to. And that's why you need to do this:
1. Select the text you are replying to in Gmail.
2. Hit the reply button.
3. Boom! Only the selected text will be quoted. Reply at will.
I know. This is common in all mail programs, but most people don't know it exists in Gmail too. I discovered it by chance a long time ago, assuming it would work, but today I discovered that most people don't know about it.
So start using it, please. Pass it around and enjoy the love of your correspondents, who will be grateful forever for your neat replies.
And, in case you think we're all about numbers here, a link to the 10 most often looked-up words in 2012 (according to Merriam-Webster).

Thursday, December 6, 2012

Considering the Obvious

Duncan Watts is a principal researcher at Microsoft and former professor of sociology at Columbia who is interested in what we can learn about humans from our networking behavior. I'm looking forward to reading his book "Everything is Obvious* *Once You Know the Answer" about common sense and its, well, weaknesses. His work has implications for marketing, social science research, and social services.

One example is government - we think we can use common sense, Watts says, to solve large social problems.
The problem with common sense is not that it isn’t sensible, but that what is sensible turns out to depend on lots of other features of the situation. And in general, it’s impossible to know which of these many potential features are relevant until after the fact (a fundamental problem that philosophers and cognitive scientists call the “frame problem”).
Nevertheless, once we do know the answer, it is almost always possible to pick and choose from our wide selection of common-sense statements about the world to produce something that sounds likely to be true. And because we only ever have to account for one outcome at a time (because we can ignore the “counterfactuals,” things that could’ve happened, but didn’t), it is always possible to construct an account of what did happen that not only makes sense, but also sounds like a causal story.
...
Common sense, in other words, is extremely good at making the world seem sensible, quickly classifying believable information as old news, rejecting explanations that don’t coincide with experience, and ignoring counterfactuals. Viewed this way, common sense starts to seem less like a way to understand the world, than a way to survive without having to understand it.
 Here's another interesting Watts column, about making predictions.
At the end of the day, making the right prediction is just as important as getting the prediction right, but it is only at the end of the day that we know which prediction was the right prediction.
If this sounds hopeless, it is -- but only if we aspire to a level of certainty about the future that is at odds with the fundamental randomness of the world.  If we acknowledge that randomness, there are still useful predictions we can make, just as poker players who count cards can make useful predictions without ever knowing with 100% confidence which particular card is going to show up next.

Wednesday, December 5, 2012

An interactive guide to energy use

The journal Nature has published an interactive guide to the world's energy use, available here.
You can find out which countries are using which resources (in 2011). There are some surprises. For example, I knew that mainland China and the US are large consumers of coal and oil energy. But they are also the largest consumers of hydro and renewable energy. I expect that's because the largest consumers of energy are going to be the largest consumers regardless of the source. If you think I'm wrong, please let me know via the comments.

You get a sense of the issue in these screenshots:


Coal:




Oil:
Hydro/Renewable


It's an interactive guide, and I selected a few countries to illustrate what I found. I tried to be reasonably representative of established and emerging economies while keeping the charts small enough to see. (Note - you will have to click through to the Nature page to interact with the data yourself.)


Tuesday, December 4, 2012

“The Cost Disease: Why Computers Get Cheaper and Health Care Doesn’t” by William J. Baumol and others


We’ve all read many articles recently about the increasing costs of important services, health care chief among them. The rising costs of education, social services, and the arts have rated hand-wringing as well. “The Cost Disease” provides a fascinating counter to this view. In the book, William J. Baumol, an economist at NYU, and his co-authors argue that the rising productivity of manufactured goods (because automation reduces the amount of labor required to produce them) offsets the rising costs of services (whose costs rise because the amount of labor required cannot be reduced). What’s more, he argues, we can afford them and should continue to pay for them.

The cost disease, Baumol argues, is the perception that because costs of services like health care are rising faster than the rate of inflation, they are being priced out of our reach. But this perception, he says, is wrong. The problem is two-fold. It's not so much the costs themselves that bother us as it is the rate at which costs are increasing. When we look at an average increase in real costs, we forget that it's an average - and that while some costs increase faster than the average, others rise more slowly or even fall. The costs of providing education, health care, or arts like opera, dance and music require a lot of labor. And the people providing that labor need to be paid enough to live, and to keep them from moving to other jobs. Baumol notes that he first published the theory in the 1960s - and that the data of subsequent years have confirmed it. For example, he reports, the salaries of health care workers have barely kept up with inflation, and those of employees at colleges and universities did not.

So, Baumol argues, if we think about the economy overall and understand that the unevenness of productivity growth is the source of the perception we will be able to afford increasing costs, even as the services take over a larger section of the economy. (He views that as an effect, not a cause.) The cost of manufacturing will continue to decrease and we will continue to innovate so the economy will continue to grow. He cautions that because the poor will continue to get poorer, we must make a choice to cut back on some manufacturing and invest in social goods. Baumol attaches some caveats to his prediction, among them the need for wise government policy-making, careful education of the public, and tackling some of the foremost problems we have already created: climate change, the easy availability of dangerous weapons, and, well, our own cupidity.

Baumol discusses cost disease in these contexts as well as in the context of global health care, and those chapters are very interesting. One point is important to note: in health care at least, quality-adjusted productivity has increased; that is, we're getting more benefits from our care. But when, he says, we look at productivity not adjusted for quality, the result is more mixed. If we look at productivity alone, we fail to think about how much money must be raised to purchase a product. If this sounds a lot like a cost-benefit analysis, it is, but it's an analysis that includes the context of the services. And that exposes a paradox: as Baumol puts it, we want the improvements in health care but don't like the associated costs.

It's when Baumol discusses the hybrid sectors of the economy, such as Research and Development or social services, that things get really interesting. In these sectors, the cost of equipment, such as the computers that support the work, quickly becomes negligible compared to the labor costs. But the work is heavily labor-dependent: you can't, for example, trust a computer algorithm to come up with the right combination of services, in the right order, to help a family enough to prevent a steep decline into violence or child neglect. This imbalance often leads to poor government decision-making in the name of cost savings.

But there is cause for hope. In addition to recognizing the cost disease, there are some hybrid sectors of the economy that repay investment. Software and business process services are Baumol's prime examples, as each repays investment two and three times, once when the developing company puts them to work and again when their customers do. Another way of thinking about them is as inputs to other services. This reframing can - and should - be applied to social services as well. A prime example is Steve Rothschild, whose book "The Non Non-Profit" I reviewed here. Rothschild sets out "create economic value from social benefit" as a key value. Doing so is critical, because showing that services create taxpayers from people who otherwise might continue to receive government benefits indefinitely is a compelling argument about efficacy - and for future funding.

"The Cost Disease" is a well-written book, very clear even for non-economists. (If I have one quibble, it's that the small pages mean that the small charts can be very hard to read.) The book should be required reading for anyone interested in public policy.

Image via Amazon.com


Monday, December 3, 2012

The increasing heat of summer

Do summers appear to be getting warmer? You're right, they are. The video animates data showing that temperature extremes are becoming more frequent in the northern hemisphere. You can see another, equally frightening, video here. You can read the original article here.

Update: There's an amusing take on the unseasonably warm December from Philip Bump here.


Friday, November 30, 2012

Creating conditions for innovation

McKinsey Quarterly's (free after you register) interview with Brad Bird, Oscar-winning director of The Incredibles and Ratatouille, has some interesting insights into creating an atmosphere that brings out the creativity in people.

For example:

* Learning about what others do
The Quarterly: Is there anything else you’d highlight that contributes to creativity around here?
Brad Bird: One thing Pixar does—which is a knockoff of old-school, Walt-era 1940s Disney—is to have all kinds of optional classes. They call it “PU,” or Pixar University. If you work in lighting but you want to learn how to animate, there’s a class to show you animation. There are classes in story structure, in Photoshop, even in Krav Maga, the Israeli self-defense system. Pixar basically encourages people to learn outside of their areas, which makes them more complete. Sometimes, people even move from one area to another.
* Improving morale
Brad Bird: In my experience, the thing that has the most significant impact on a movie’s budget—but never shows up in a budget—is morale. If you have low morale, for every $1 you spend, you get about 25 cents of value. If you have high morale, for every $1 you spend, you get about $3 of value. Companies should pay much more attention to morale.
The context is making movies; the content is widely applicable - read the full interview!

Thursday, November 29, 2012

Improving hurricane intensity predictions

A bit of hurricane folklore has it that hurricanes have a dry side and a wet side - that is, whether you'll get more wind than rain when a hurricane passes through depends on which side of the center you're on. A new report from the NASA Jet Propulsion Lab points out that not only is this folklore true, it may help improve the ability to predict the intensity of hurricanes.
The researchers found the hurricanes that rapidly intensified tended to exist within a moister large-scale environment than weaker storms. The rapidly intensifying hurricanes had statistically significant higher relative-humidity levels in their environments than storms whose intensity was weakening or unchanged.
 . . .
The team found substantial differences in relative-humidity levels between storm quadrants. One factor may be the shape of the Atlantic basin. Hurricanes in the Atlantic usually travel to the west or northwest -- regions that are drier, climatologically-speaking, than from where the storms originated. This causes the front two quadrants of Atlantic hurricanes to be drier than their rear two quadrants.

A unique result the team found is that in their front-right quadrants, rapidly intensifying hurricanes tended to have sharply higher amounts of upper tropospheric moisture near their centers than they did farther from their centers. 

A previous post linking to some good explanations of why predicting hurricane intensity is so complex is here. NASA is "exploring collaborations" that will allow forecasters to incorporate relative humidity data into hurricane prediction systems, so we may be able to see test data in the next few years.

Wednesday, November 28, 2012

High fructose corn syrup, diet, and studies

There's been a lot of news coverage in the past couple of days about a study, "High fructose corn syrup and diabetes prevalence: A global perspective" by Michael I. Goran and others published this month in the journal Global Public Health. See, for example, here and here. (The NY Times has posted a .pdf of the article.) The paper itself is worth reading for several reasons.
  • It's really interesting. Across the globe, the number of people with diabetes is increasing, from 153 million in 1980 to 347 million in 2008. Most of the increase is coming as "Western-style" diets, ie those with lots of processed foods, carbohydrates, and especially sugar, become more widely consumed. And, as the report says:
    A growing body of evidence supports the hypothesis that in addition to overall sugar intake, fructose is especially detrimental to metabolic health and risk for type 2 diabetes. This is of particular concern given the global changes that are occurring in the use of high fructose corn syrup (HFCS) in food and beverage production . . . (citations omitted)
    The data table shows that, while not all the higher-diabetes countries consume HFCS, all the countries that consume HFCS have a diabetes prevalence that is higher - 20% higher - than countries that do not use HFCS. Note that this is an ecological study, looking at populations, not at individuals, and it does not establish causation.
  • It suggests a reason why increased consumption of HFCS can contribute to an increase in rates of diabetes, even though HFCS does not depend on insulin. Instead, HFCS is metabolized by the liver, and does not generate leptin (which makes you feel full). In addition, there is some evidence that fructose helps generate fat, particularly the bad fat around your waist. If you're heavier, you're more likely to develop diabetes. In fact, diabetes-management sites like this one recommend keeping an eye on how much HFCS you consume.
  • Nonetheless, it's important to remember that this study describes correlation, not causation. The New York Times was a little incautious when it quoted Marion Nestle as saying only that the study's conclusion was "a stretch" - her view is more nuanced, and this discussion on her blog is worth reading. Some highlights:
As with all correlational studies, something else could be going on that causes HFCS, sugars of all types, and diabetes to increase.
 And, later on:
Yes, HFCS is sugar(s)—glucose and fructose.  So is table sugar (sucrose).

But the bottom line goes for both: Everyone would be better off eating less sugar(s).
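The 20 percent gap the study reports is a comparison of group averages across countries, which is what makes it an ecological study. A minimal sketch of that kind of calculation, with invented prevalence figures standing in for the study's data:

```python
# Invented country-level diabetes prevalence (%), grouped by HFCS use.
# These numbers are illustrative only, not the study's actual data.
from statistics import mean

hfcs_users = [8.0, 7.2, 9.1, 8.5]
non_users = [6.5, 7.0, 6.1, 7.3]

# Relative gap between the two group averages.
gap = mean(hfcs_users) / mean(non_users) - 1
print(f"{gap:.0%} higher average prevalence in HFCS-using countries")

# This compares populations, not individuals -- a country-level gap
# like this is correlation, and says nothing about causation.
```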

Tuesday, November 27, 2012

Four Charts and a Video

in which The Atlantic identifies the impact climate change is already having.

Bonus picture:
That's a time-lapse picture of glacial ice melting in the Arctic, illustrating the Guardian's article about a new film, "Chasing Ice." More pictures available at the links.

Image via guardian.co.uk

Monday, November 26, 2012

Analyzing tweets in real time


The computer giant SGI has teamed up with scientists at the University of Illinois to create the Twitter Global Heartbeat, a "real-time combined population, tone and geographic analysis and heat map visualization of Tweets." That is, the Twitter Global Heartbeat takes 10% of Twitter's 500 million daily tweets as they are posted and analyzes their content, tone, and location. The project converts the data to a map, showing hot spots of positive and negative comments. The video above is an analysis of tweets as Hurricane Sandy approached the US coast and moved inland. (I found it thanks to Robert Wright's post on TheAtlantic.com.)

It's a new service (my choice of videos was Sandy or the US Presidential election). There are also some snapshots and graphs. Here's an example of the former, picturing "Global Sentiment from Live Twitter Feed." (Red is negative, blue is positive.)

The snapshot is from November 15 - I can see why the Mideast is such a hotspot of unhappiness, but what was happening in Indonesia then? If you know, let us know in the comments.

You can follow the Twitter Global Heartbeat's twitter feed here.

Friday, November 16, 2012

Useful programs for organizing information

Like many other people, I am constantly on the lookout for useful organizing tools. Here are a couple to ponder, and play with, over the Thanksgiving holiday.

Evernote: Evernote's slogan is "Remember everything." It's a free program downloadable to most platforms (computers, tablets, phones) and operating systems. The main product is an application that allows you to create one or many notebooks. You can type in information, copy urls, add photos or drawings. You can share pages or notebooks with co-workers, organize notebooks, and add searchable tags. Content is also searchable.

Best of all, Evernote syncs across platforms, so if you update something on your phone, it will be updated on your computer as well. There's also a handy little plug-in, the web clipper, that lets you clip and copy content or urls. Oh, and you can format your notes as text or lists. I use it as I scan the web, and I've also moved all my recipes into it. It's nice not to be dependent on all those decomposing fragments of paper.

TheBrain: Graphics-minded users find the mind-mapping software TheBrain very helpful.
Here's how the website describes it:
TheBrain moves beyond linear folders and lists, letting you create a network of information organized the way you think about it. You don't have to force any idea or project into a single folder. With TheBrain you can connect things to anything else. TheBrain applies visualization to your information, creating a digital map 
I tend to be more of a word person than an images person, so have found Scapple, which I've just started playing with, to be very helpful as I think things through. It's available in a beta version from Literature and Latte. Here's a description:
Scapple is a tool for getting early ideas down as quickly as possible and making connections between them. The main advantage of doing this in Scapple instead of on paper is that you don't run out of paper (the Scapple canvas expands to fit as many notes as you want to create), you can move notes around to make room for new ideas and connections, it's easy to delete and edit notes, and it's easy to export your notes into other applications when you know what you want to do with them. 
The beta version is free; the developers ask for comments and feedback. The final version will be very low cost.

So try these programs out, play with them over the holiday, and let me know what interesting uses you come up with for them. I'll be taking next week off from blogging, so will be back November 26. Enjoy the holiday.


Thursday, November 15, 2012

Political action on climate change . . . . maybe?

So we've just had a monster storm on the East Coast, and (re-)elected a president who is ready to take action on climate change. As Climate Central puts it, at his first press conference after the election:
Obama reaffirmed his view that manmade emissions of greenhouse gases are contributing to global warming, and stated his intent to continue to take action to reduce greenhouse gas emissions. “I am a firm believer that climate change is real, that it is impacted by human behavior and carbon emissions,” he said, noting that, “we have an obligation to future generations to do something about it.”
Here's a short piece from Auden Schendler about how we can go about it. He outlines three approaches - the Charge of the Light Brigade, the Battle of Leyte Gulf, and the Battle of Agincourt. Schendler provides an interesting set of metaphors, and I think his deeper point is well taken - feel free to discuss in the comments.

Image via telegraph.co.uk

Wednesday, November 14, 2012

If you haven't seen it, read this excellent column by Eduardo Porter, "Charity's Role in America, and its Limits," in today's New York Times. He argues that, while philanthropy in the US is strong, it is not the solution to various social problems. Here's one sample:
In fact, a small portion of philanthropic efforts are aimed at helping those who most need it. A study by Rob Reich, a professor of sociology at Stanford University, concluded that only a small share of charity redistributes income from the wealthy to the poor. A big chunk of the $40 billion donated last year to educational nonprofits went for new buildings and new programs at someone’s alma mater. Donations to schools in affluent school zones tend to help their own children, not those on the other side of the tracks. 

Using measurements - how to get started


Sometimes the hardest part of doing something new is getting started. Here are a couple of ways to find your way into the process of using outcome measures. You don't have to try them in the order I've listed them - sometimes it helps to approach a problem from several angles.

  • Think hard about what you do, and what you want to know about it. What is the end result of your services? If you provide education or tutoring, what sort of improvement do you expect to see? What does that improvement mean in the long run for the students you are serving?
  • Why are you looking at numbers in general, and this process in particular? For example, if an important function is backlogged, you can use numbers to tell that the backlog has been cleared. But then you can take a deeper look, using what you've learned to identify the root cause of the problem. The goal is to prevent it from recurring once you have cleaned it up.
  • What data systems do you have? How can you harness them to provide numbers? Bring in your IT and QI staff. But don't use a number just because you can measure it. 
  • Make sure your measure tells you what you think it does - don't measure something just because it's convenient. One example - it's very easy to measure the number of people who start a program. Maybe comparing that number to the number who complete the program will tell you something you need to know. But if it doesn't, don't use it!
  •  Don't rely on a single measurement - you are likely to miss nuance. At the same time, don't try to measure too many things. And, as always with numbers, remember the context.
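As a concrete instance of the start-versus-complete comparison in the list above, a completion-rate calculation might look like this (the program name and figures are hypothetical):

```python
# Hypothetical tutoring-program data: starts and completions by quarter.
quarters = {
    "Q1": {"started": 40, "completed": 31},
    "Q2": {"started": 55, "completed": 36},
    "Q3": {"started": 48, "completed": 41},
}

for name, q in quarters.items():
    rate = q["completed"] / q["started"]
    print(f"{name}: {rate:.0%} completion")

# A dip in the rate flags a question to investigate, not an answer --
# as always with numbers, remember the context before acting on it.
```

The point of tracking the rate rather than the raw completion count is that it stays comparable even when enrollment swings from quarter to quarter.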

Here's a good description of an effective use of numbers, from McKinsey Quarterly's Report, "The Global Gender Agenda."
McKinsey’s more general work on transforming the performance of companies shows that those with a clear understanding of their starting point are more than twice as likely to succeed as those that are less well prepared. In a gender diversity context, this understanding means knowing the gender balance at every level of the organization; comprehending the numbers by level, function, business unit, and region; and then monitoring metrics such as pay levels, attrition rates, reasons women drop out, and the ratio between women promoted and women eligible for promotion.
Why go to this expense? Establishing the facts is the first step toward awareness, understanding, and dedication to improvement. Using a diagnostic tool, one company simulated how much hiring, promoting, and retaining of women it would require to increase the number of senior women managers. That approach helped it set an achievable and, just as important, sustainable target that would not compromise a highly meritocratic corporate culture. With an overall target—that 25 percent of managing directors and directors should be women by 2018—and a clear understanding that the bar for promotion could not be lowered, managers now look harder for high-potential women and start working with them earlier to develop that potential.
You can see my earlier post about that report here.

This is another in a series of occasional posts about developing and using outcome measures. You can see a previous post, with links to related posts, here.

Tuesday, November 13, 2012

The Signal and the Noise, by Nate Silver

I've been a fan of Nate Silver's work since the 2008 election when I, like perhaps many of you, obsessively checked his blog. I've always thought that his writing is clear and that he is transparent - to a point - about his methodology. So I was eager to read his very interesting book, "The Signal and the Noise."

What Silver sets out to do in this book is explore our ability to make predictions based on big data. Silver's main thesis is that we should be using Bayesian statistics to make and judge our predictions about the world. As Silver puts it,
The argument made by Bayes and Price is not that the world is intrinsically probabilistic or uncertain . . . It is, rather, a statement . . . about how we learn about the universe: that we learn about it through approximation, getting closer and closer to the truth as we gather more evidence. [Italics in original.]
As Silver acknowledges, this approach is not the one we are taught in school (or in classes on the history and philosophy of science; for a review of that approach, read the first third or so of Jim Manzi's book "Uncontrolled." My review of "Uncontrolled" is here.) Instead, Silver argues, we use statistics that focus on our ability to measure events. We ask, given cause X, how likely is effect Y to occur? This approach raises lots of issues, such as separating cause from effect - we get mixed up a lot about the difference between correlation and causality. We mistake the approximation for reality. We forget we have prior beliefs, and so allow our conclusions to be biased.

In contrast, Silver explains, the Bayesian approach is to regard events in a probabilistic way. We are limited in our ability to measure the universe, and Pierre-Simon Laplace, the mathematician who developed Bayes' theorem into a mathematical expression, found an equation to express this uncertainty. We state what we know, then make a prediction based on it. After we collect information about whether or not our prediction is correct, we revise the hypothesis. Probability, prediction, and scientific progress - Silver describes them as intimately connected. And then he makes a broader claim:
Science may have stumbled later when a different statistical paradigm, which de-emphasized the role of prediction and tried to recast uncertainty as resulting from the errors of our measurements rather than the imperfections in our judgments, came to dominate in the twentieth century.
Silver describes the use of Bayesian statistics (to greater or lesser rigor) in many contexts, including sports betting, politics, the stock market, earthquakes, the weather, chess, and terrorism. We are better at predictions in some of these contexts than we are in others, and he uses the chapters to illustrate various corollaries to his main theme. In his first chapter, on the 2008 financial meltdown, he identifies characteristics of failed predictions: the predictor focused on stories that describe the world we want, we ignore risks that are hard to measure, and our estimates are often cruder than we think they are. On the other hand, in a chapter about sports data, he makes a compelling case for the premise that a competent forecaster gets better with more information. Throughout, he urges us to remember that data are not abstractions but need to be understood in context.
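The update cycle Silver describes - state a prior, make a prediction, collect evidence, revise - is just Bayes' theorem applied repeatedly. A minimal sketch, with probabilities I've invented purely for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Invented example: we start out 20% confident it will rain tomorrow.
prior = 0.20
# Then the barometer drops. Suppose (made-up likelihoods) this happens
# 70% of the time before rain, but only 15% of the time otherwise.
posterior = bayes_update(prior, 0.70, 0.15)
print(f"belief in rain: {prior:.2f} -> {posterior:.2f}")

# Tomorrow's weather then becomes the next piece of evidence, and the
# posterior becomes the new prior -- approximation refined step by step.
```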

This is not a how-to book, and it certainly left me with many questions. How do you test social programs using Bayesian analysis? But it is a very good starting point.

Image via amazon.com

Thursday, November 8, 2012

More post-Sandy commentary

This column by Bob Massie of the New Economics Institute describes a series of discussions he had back in 2000 - that's 12 years ago - about the possibility of developing cars powered by hydrogen fuel cells. The idea wasn't that far-fetched; GM unveiled the Hy-Wire concept car in 2002.

Of course, Massie recounts, there were problems with the idea, one of them being how to convert gas stations to supply the fuel. He asked someone from a petroleum company about the problem of fuel cell infrastructure. Here's what he says:
How many gas stations were there in the United States? I asked. About 150,000, he said. How much would it cost to convert a gas station to provide hydrogen? I continued. Anywhere from $500,000 to $1,000,000, he replied. And did every gas station have to be converted to provide adequate supply? No, he said, they anticipated that only about a third of existing stations would need to switch to provide adequate coverage.
I did the math. "So you are saying that the price of converting every gas station in the country at the maximum price would be about $150 billion?" I asked. Yes, he said. "And if the price were only $500,000 and we only did 1/3 of them, we could do the job for $25 billion?" Yes, he agreed. We came to the conclusion that a reasonable cost for the whole job would be about $50 billion.
That looks like a lot of money, but at less than 1/3 of 1% of a single year's GDP, it seemed a bargain price to change America's automotive fuel source so profoundly. The investment had the potential to completely revitalize America's auto industry, alter our fuel sources, and, because hydrogen fuel cell reforming emits less than half the carbon emissions of internal combustion engines, catapult us into a clean energy future. When the discussion came up in the press, however, political and business analysts ridiculed the idea that Congress would ever commit $50 billion to such a switch.
That $50 billion? It's about the cost of the cleanup from Hurricane Sandy, in today's dollars. The cost to convert would have been more than $50 billion, of course, but Massie has a pretty good point, I think. Do you agree?
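Massie's back-of-the-envelope figures are easy to check. Here's a quick sketch; the station count and cost range are his, from the exchange quoted above:

```python
# Back-of-the-envelope check of Massie's hydrogen-conversion figures.
stations = 150_000                        # US gas stations (his figure)
cost_low, cost_high = 500_000, 1_000_000  # per-station conversion cost

# Worst case: convert every station at the maximum price.
max_total = stations * cost_high

# Best case: convert only a third of stations at the minimum price.
min_total = (stations // 3) * cost_low

print(f"${max_total / 1e9:.0f}B worst case, ${min_total / 1e9:.0f}B best case")
# Massie's group settled on roughly $50 billion, toward the low end.
```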

Wednesday, November 7, 2012

Election predictions


In case you might have missed it, once again Nate Silver correctly predicted the outcome of the election. You can follow his thoughtful description of his model and regular updates on his blog.

I am reading Silver's book, "The Signal and the Noise," now, and will post a review sometime next week. In the meantime, here is a post describing the reaction to Silver's success (and his response) in various media. (That's a screenshot of graphic artist Christoph Niemann's take on it.) Basically,
Not a bad night for the math nerds. However, the truth—which Silver would readily admit—is that he didn't really "predict" anything. The math did ... and the math was based on polls, which are also based on math. He pulled them all together and came back with a number, which was very useful (and comforting to Democrats), but not magic. 
 And here is an Atlantic.com post showing how various pundits' predictions succeeded (or, in most cases, did not).

Tuesday, November 6, 2012

More from McKinsey on Women in the Workplace

Last spring I wrote a post about a new report from McKinsey about its research on the advancement of women in the workplace, particularly large corporations. The McKinsey Quarterly (free once you register) has now followed that with a further report that suggests approaches for increasing the number of women at the higher levels of corporations, government, and academia. As usual with McKinsey, the insights apply to not-for-profits as well.

The conclusion? Well, progress has been made, but structural problems remain:
Firmly entrenched barriers continue to hinder the progress of high-potential women: many of those who start out with high ambitions, for instance, leave for greener pastures, settle for less demanding staff roles, or simply opt out of the workforce. . . . And everywhere we look, despite numerous gender diversity initiatives, too few women reach the executive committee, and too few boards have more than a token number of women.
So what is to be done? McKinsey offers four strategies for committed leaders to follow:

1. Treat gender diversity like any other strategic business initiative - i.e., set a goal and monitor it regularly. Expect the process to take some time, possibly years. Keep asking about it.

2. Ask for--and talk about--the data - in particular, think about the points where women exit. Why do they leave then? How many women are in the pipeline? Oh, and hold everyone in senior management accountable for those numbers.

3. Establish a culture of sponsorship - everyone, men and women, should sponsor, mentor, and support two or three future leaders. And the current CEO/ED should spend time with them on visits.

4. Raise awareness of what a diverse work environment looks like - talk about your efforts; celebrate and publicize success.

And if you're in the US and still reading on November 6, don't forget to vote!

Monday, November 5, 2012

The Northwestern Juvenile Project

Today's New York Times carries a description of the Northwestern Juvenile Project, a longitudinal study of the mental health and outcomes for delinquent youth. The study follows 1800 (now down to 1644) youth with the goal of understanding drug and alcohol abuse, mental disorders, violence, HIV incidence, and paths or barriers to service. The study is directed by Linda A. Teplin of the Department of Psychiatry and Behavioral Sciences.

The design of this study, with its longitudinal approach and emphasis on follow-up, is unusual and, from the description in the paper, sounds very well thought through. Because they are expensive, long-term follow-up studies are rarely done. The potential for useful information for policy makers and service providers is immense. I haven't yet had time to look at the papers, but the abstracts -- including this one, which concludes that most of the youth in detention had a history of physical abuse -- suggest that there is much to be learned.

You can read more about the study, and find links to the papers that have already come out of it, here.

Friday, November 2, 2012

Sandy's tail lights



This beautiful image displays the impact that Hurricane Sandy had on the skies late in the afternoon of October 30th in Huntsville, Alabama. Each of those solar effects is pretty unusual; to see them together is rarer still.

The picture sent me to the website spaceweather.com to get an explanation:
The apparition is almost certainly connected to hurricane Sandy. The core of the storm swept well north of Alabama, but Sandy's outer bands did pass over the area, leaving behind a thin haze of ice crystals in cirrus clouds. Sunlight shining through the crystals produced an unusually rich variety of ice halos.
 Click on the links to find out more about these phenomena.

Thursday, November 1, 2012

Understanding Sandy's storm surge


The impact of the storm surge from Sandy on New York City and New Jersey is now becoming clear: it was tremendous. The surge at the Battery reached more than 13 feet above mean low tide, and water flooded streets, subway stations, and tunnels. The screenshot is from Climate Central's sea level change forecast map, showing what a 10-foot rise in sea level will look like in lower Manhattan.

Three different but related actions contribute to a storm surge:
  • Wind - which piles water up high
  • Waves - which push the water ahead faster than the water can drain back
  • Pressure - low pressure of a hurricane means that water typically is higher near the eye
You can find a very clear explanation of a storm surge, by Jeffrey Masters of Weather Underground, here.

Why is a storm surge so damaging? All that water is heavy, and carries a lot of force. As Masters explains it,
A cubic yard of sea water weighs 1,728 pounds--almost a ton. A one-foot deep storm surge can sweep your car off the road, and it is difficult to stand in a six-inch surge. Compounding the destructive power of the rushing water is the large amount of floating debris that typically accompanies the surge.
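Masters's figure is easy to verify from first principles. With seawater at a typical surface density of about 1,025 kg per cubic meter (my assumption; he doesn't state one), a cubic yard does come out to roughly 1,700 pounds:

```python
# Verify the weight of a cubic yard of seawater.
SEAWATER_DENSITY = 1025        # kg per cubic meter (typical surface value)
YARD_IN_M = 0.9144             # exact definition of a yard
KG_TO_LB = 2.20462

cubic_yard_m3 = YARD_IN_M ** 3             # about 0.765 cubic meters
mass_kg = SEAWATER_DENSITY * cubic_yard_m3
weight_lb = mass_kg * KG_TO_LB

print(f"{weight_lb:.0f} lb")   # lands within a pound or two of Masters's 1,728
```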
Here's a photograph of some of the local debris deposited by the storm surge:


And yes, while no particular event is due to global warming, it's clear that global warming made Sandy worse: warmer seas meant that the storm was stronger and generated a bigger surge, and higher seas meant there was no place for the water to go but inland. Here's a good explanation.

Update: The Center for Climate and Energy Solutions has produced a useful fact sheet.

And in case you don't believe that Mitt Romney really said that the federal government should get out of the post-disaster aid business, here's a link to the transcript.


Tuesday, October 30, 2012

Thinking about distractions

Having spent most of yesterday and half of Sunday thinking about, preparing for, and watching Hurricane Sandy pass through, I thought this was a good time to report on this column from the Harvard Business Review Blog, "Three Ways to Think Deeply at Work." The author, David Rock, of the NeuroLeadership Institute, conducts research into the role of the unconscious mind in solving problems.

Rock identifies three techniques to help you think more deeply. These should help whether you work in an office or from home. Here they are:
  • Distraction: Experimental work shows that a brief distraction can help you solve a complex problem, one that is too big for the conscious mind to solve. (The experiment involved picking which of several cars, each with 12 attributes, was the better buy.) The group given a brief distraction did better than the group told to solve the problem immediately and the group told to keep working at it. But keep the distraction short. Rock says it works because
stepping away from a problem and then coming back to it gives you a fresh perspective. The surprising part is how fast this effect kicked in — the third group only had two minutes of distraction time for their non-conscious to kick in. . . something  . . .  accessible to all of us every day, in many small ways.
  • Plan for time to do the tasks that require deep thinking, and time for the distractions. Here's what Rock suggests:
* Think about one question/idea that needs insight and keep this thought in your subconscious mind.
* Clear your conscious mind by using this two-step system: move your thought(s) from your mind to a list and then clear your list when you have a short break (if your meeting is canceled, for instance, or your flight is delayed).
* Plan your week and month by listing three priorities you would like to accomplish.
* Make certain you have at least four consecutive, uninterrupted hours a day dedicated to the three priorities you identified.
  • Understand that your mind can manage only so much at once. So, in addition, Rock recommends scheduling the tasks that require the most attention when your mind is fresh and alert, and "grouping ideas into chunks whenever you have too much information."
I'll admit I'm trying some of these. Do they work for you? Let me know.

Monday, October 29, 2012

Hurricanes since 1851

The screenshot above is a projection (notice the South Pole center) showing hurricanes since 1851. North America is to the top right, just to the left of the firework of hurricane paths. The graphic was produced by John Nelson on UXBlog. The brighter colors are the more intense storms.

Here's what else Nelson says:
A couple of things stood out to me about this data...

1) Structure.

Hurricanes clearly abhor the equator and fling themselves away from the warm waters of their birth as quickly as they can. . . . The void circling the image is the equator.  Hurricanes can never ever cross it.

2) Detection.

Detection has skyrocketed since satellite technology but mostly since we started logging storms in the eastern hemisphere.  Also the proportionality of storm severity looks to be getting more consistent year to year with the benefit of more data.
You can see an animated version on the Guardian data blog, here.

Friday, October 26, 2012

Tracking Sandy


If you're interested, that's a screenshot of Hurricane Sandy's predicted path as of this morning. For various reasons meteorologists are getting better and better at predicting the paths of storms, though not their intensity at any one time. The possibility that Sandy may come into contact with a more typical North American weather system means that a big hybrid system may develop.
And, as Adam Sobel explains on Climate Central, that would result in an unpredictable storm.
As Sandy moves northward, it will move over cooler water. If this were all that were happening, Sandy would weaken, as tropical cyclones moving toward a pole typically do. At the same time, though, Sandy will come close enough to the upper trough now over the U.S. to interact with that trough in something like the way that an extratropical surface low normally would  . . .
When this happens, they will form a hybrid storm system with some tropical and some extratropical properties. Some energy will still come from the ocean surface, but some will now come from the pole-to-equator temperature contrast. This new energy source will enable Sandy to maintain its intensity, or maybe even increase it.
This process is called “extratropical transition.” It poses a lot of problems for forecasters. In the first place, the computer models aren’t that great at predicting exactly when it will happen. 
So keep an eye on the storm. You can do it easily through ESRI's social storm tracker:
