The End of "Moore's" is Not the End of the World

In 1965, Gordon E. Moore, the co-founder of Intel, made a bold prediction that would shape the technology industry for years to come:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years."

Ignore the nerdy wordiness: Moore basically surmised that the number of components in an integrated circuit would double each year, thus lowering costs and boosting processing speeds at an exponential rate. It was a terrific stab at predicting the future of a then-infant industry, but it didn't prove completely accurate. Moore tweaked his prediction in 1975, changing the rate to a "doubling every two years" instead of every year. This updated forecast, now known as "Moore's Law," has stood up remarkably well against the test of time, but as John Markoff notes in the New York Times, the trend may be coming to an end.
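
For the mathematically inclined, here is a minimal sketch (in Python) of what a doubling every two years compounds into. The roughly 2,300-transistor count commonly cited for Intel's first microprocessor is used purely as an illustrative 1971 starting point, not as a claim about any particular product line.

    # Back-of-envelope sketch of Moore's revised prediction: a doubling every two years.
    # The 1971 baseline (~2,300 transistors, the oft-cited figure for Intel's first
    # microprocessor) is used purely for illustration.

    def projected_transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
        """Project a transistor count assuming a clean doubling every `doubling_years`."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001, 2011):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

Run it and the count climbs from a few thousand to a few billion over four decades, which is roughly the trajectory the industry actually followed.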

The problem is not that [researchers] cannot squeeze more transistors onto the chips -- they surely can -- but instead, like a city that cannot provide electricity for its entire streetlight system, that all those transistors could require too much power to run economically. They could overheat, too.

"I'm giving her all she's got, Captain!"


I can imagine the talk in the chip-maker meeting rooms. "OK, guys. We can't just add transistors anymore. It's time to think outside the wafer."

If chip-makers want to continue to boost speeds at the rates we're all used to, this is just what they will have to do. They already adapted once when they hit a speed bump about six years ago: chip-makers used to simply boost processor clock speeds, but they hit a ceiling around 3-4 gigahertz because the chips overheated. They overcame that obstacle by adding more cores and improving chip architecture. This number disparity still prompts the technologically ignorant to wonder why the heck a 3.6 gigahertz Pentium 4 is slower than a 1.7 gigahertz Core i7. ("Don't worry, Dad, just buy it for me.")
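
For the curious, the disparity boils down to doing more work per clock tick and spreading that work across more cores. Here is a hedged back-of-envelope model in Python; the instructions-per-cycle and core-count figures are illustrative assumptions, not measured benchmarks, and the comparison only holds for workloads that can actually use every core.

    # Rough throughput model: clock speed (GHz) x instructions per cycle x core count.
    # The IPC values and the fully parallel workload are illustrative assumptions only.

    def relative_throughput(clock_ghz, instructions_per_cycle, cores):
        return clock_ghz * instructions_per_cycle * cores

    old_chip = relative_throughput(clock_ghz=3.6, instructions_per_cycle=1.0, cores=1)
    new_chip = relative_throughput(clock_ghz=1.7, instructions_per_cycle=2.5, cores=4)

    print(f"3.6 GHz single-core chip: {old_chip:.1f} (relative units)")
    print(f"1.7 GHz quad-core chip:   {new_chip:.1f} (relative units)")
    # The lower-clocked chip comes out well ahead once architecture and core count are factored in.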

Still, let's say that Moore's Law does come to a halt and the processor speed apocalypse does occur; will it really be that bad? Most of the activities that the average person performs on their computing devices are not processor-intensive (email, internet, music, videos, etc.), so the end of Moore's Law would not inhibit those functions. A slower rate of consumer technology upgrades would also assuage those who are easily afflicted with buyer's remorse, and it would mean that last year's technology might hold better resale value.

Lastly, reduced speed increases do not necessarily equate to a downturn for the economy, as is feared by some. Let us not forget the emerging markets in China and India; they're going to want their iPads and iPhones regardless of how much faster the next generation is over the previous one. And let's be honest, we'll want them, too.

Addicted to Our Smartphones

I consider myself a very "tech savvy" person, but I still draw the line when it comes to owning a phone that's smarter than me. However, for one-third of Americans, smartphones seem to have supplanted canines as "man's best friend," and new research is showing that this relationship can develop into a problematic form of addiction.

"Watching people who get their first smartphone, there's a very quick progression from having a basic phone you don't talk about to people who love their iPhone, name their phone and buy their phones outfits," said Lisa Merlo, director of psychotherapy training at the University of Florida.


For humans, smartphone addiction can manifest itself in many ways, such as:

  • Utilizing the phone to avoid human interaction (He's not talking to his mother; he's talking to his phone.)
  • Becoming so engrossed in the phone that everything else is completely tuned-out ("Dude. Dude! Helllllllo?")
  • Excessively using the phone at the expense of work productivity, personal health, or the well-being of others (Dnt Txt & Drv)
  • Becoming so reliant on the phone that one can't function without it (A telltale sign of addiction)
  • Spending more money on data plans than one can afford (the average wireless access bill for smartphone users is now $107 per month)
  • Using the phone before going to bed, thus increasing cognitive arousal and creating sleeping problems

Michelle Hackman, a recent high school graduate, won a $75,000 prize in Intel's Science Talent Search by conducting research on teen attachment to smartphones. She found that when students were separated from their phones, they became understimulated - their heart rates were lower and they lacked the ability to entertain themselves. This is a sure sign of partial dependency.

Smartphone addiction may not simply be an anomaly; it might herald a complete change in how humans interact socially. We are more connected to others than we have ever been, and smartphones have become a key medium through which we can interface with millions of people and a vast amount of information. But this digital connectivity may be creating personal disconnection. As people become more dependent on smartphones, will the devices act as a detriment to direct human interaction? A study on a grand scale is certainly needed to address this increasingly pertinent question.

"Neuromusic" Gives New Meaning to "Formula" Music

The recording industry is locked in a constant search for the next hit song. It currently relies on tried-and-true methods such as hooks, sex appeal, formulaic structure, and catchy lyrics, but a future answer could be to create "neuromusic" - music designed for the brain. As Kevin Randall reported yesterday for Fast Company, neuroscientists may have discovered how brainwaves can predict hit songs.

In 2006, Emory University researchers Gregory Berns and Sara Moore placed 32 adolescents in an MRI scanner and had them listen to a selection of short song clips downloaded from Myspace.com. According to Scientific American:

The scientists took scans of song-related activity in the children's brains, and had the children report how likable each song was. After identifying brain areas whose activity was correlated with song likability, the scientists patiently sat on the data for about 3 years.

After the three years had passed, Berns and Moore re-examined their data to see whether certain areas of brain activity predicted a song's eventual success.

For one area -- the nucleus accumbens -- the answer was yes. Though it certainly didn't distinguish between hits and duds with dead-on accuracy, more activity in the accumbens was loosely predictive of higher sales.

So there you go, recording industry. If you want to produce a chart-buster, the answer may be to create a tune that tingles the listener's nucleus accumbens. Musical artists are already giving this method a try.

At least one top 10 hip-hop artist has hired neuromarketing firm MindSign to study "brain activation" elicited by different music video elements while listeners lay in an fMRI machine, MindSign cofounder Philip Carlsen tells Fast Company.

Who knows, perhaps "neuromusic" will one day supplant the current well-known formula for a hit song, "intro-verse-chorus-verse-chorus-bridge-chorus-chorus-end" (see Avril Lavigne's "What the Hell" for an unfairly catchy example). However, I don't see this transition occurring anytime soon, as there are still plenty of 80's chart-toppers for modern-day artists to easily remix and turn into #1 singles.

Cities Cause Animals to Adapt and Evolve

For its human residents, New York is a melting pot of multiculturalism. Millions of people with vastly different religions, customs, characteristics, and beliefs interact and intermingle. This mixing creates confrontation and adaptation among New York's human citizens, so why should this situation be any different for the "Big Apple's" animal citizens?

Due to "life in the big city," which includes variables such as the introduction of invasive species, human-caused pollution, and the city's rapid growth altering the native habitat, New York, and other metropolises like it, are becoming hotbeds for studying animal evolution and adaptation. Who needs a lush Amazon rainforest or the Galapagos Islands?

Cities have often been a focal point for the human introduction of invasive species. Metropolises have a "worldly" set of inhabitants, and these inhabitants often bring plants and animals from their native lands. As the New York Times reported, biologists have discovered ant species originating from all over the world on different street medians. These different varieties of ants interbreed and create hybrids. The medians almost seem like miniature islands in the Galapagos.

Animals also evolve to live with human pollution. In New York's Hudson River, scientists have found fish that have swiftly evolved to become resistant to PCB pollution. Biologists have also discovered worms that are nearly invulnerable to cadmium poisoning.

Animal adaptation to human pollution has also been documented in London. Before the Industrial Revolution, the overwhelming majority of peppered moths sported a light coloration that camouflaged them against the pale, lichen-covered trees. However, as smoke streamed from stacks during the Industrial Revolution, those trees became blackened with soot. The previously rare black variety of peppered moth quickly flourished because it was now camouflaged from predators against the darkened bark, while the lightly-colored variety declined as predators picked it off. Despite wanton contamination of the environment, life found a way to survive.

Amid the bluster of human activity in a thriving metropolis, urban animals have also been widely observed adapting their behavior to the scene. This is best seen among crows, which have ingeniously devised ways to broaden their food sources using human technology. Birds are famously known for dropping nuts and bones from great heights to crack their hard outer shells and reach the food inside, but urban crows in Japan have found an even better method. As David Attenborough reports in this video, the crows simply drop the previously impenetrable food into a street and let the cars crack it open.

It's clear that animals are adapting and evolving to life in the "big city." This presents a plethora of opportunities to study evolution and adaptation close to home, and may even make metropolises like New York just as valuable to biologists as the Galapagos Islands.
  
If New York and the Galapagos ran into each other, I envision their "street talk" going something like this:

New York: "Eh! Eh! Galapagos! Yeah, I'm talkin' to you. Your evidence for Darwin's Theory of Evolution has got nuthin' on mine."

Galapagos: "Hey, New York! Kiss my endemic species."


This piece was inspired by an article posted to Real Clear Science on July 26th, 2011.

Video Games Learned From Skinner; Will Our Employers?

CONGRATULATIONS!!!!! You have just slain the Black Dragon of Towmak and have discovered an epic treasure!
DING! You have just reached level 24!
KA-BOOM! You successfully destroyed your objective and have attained a higher rank!

BEEP-BEEP-BEEP-BEEP Fifteen minutes till work. Ugggh.

Video games, for the most part, are extremely fun and fulfilling; our jobs, by and large, are not. In video games, achievements are directly rewarded and reinforced. At our work, we often do not receive direct rewards for a job well-done. But hey, at least we're earning money so that we can do things that we enjoy... like play video games.

The bottom line is that video game designers learned from Dr. Burrhus Frederic (B.F.) Skinner and our employers did not. B.F. Skinner, a National Medal of Science winner, pioneered research into behaviorism by studying methods of behavioral reinforcement. Through his studies with rats, Skinner discovered five prominent reinforcement "schedules," all of which are heavily utilized in the video games that we play, but are often lacking in the jobs that we perform. They are:

Continuous Reinforcement- In Skinner's lab, a rat was rewarded with food each time it pressed a bar. In Super Mario Brothers, you are rewarded with a coin and a delightful "ka-ching" sound each time you smack Mario's head on a hovering box.

Fixed Ratio- In Skinner's lab, a rat was rewarded with food after every nth bar press. In World of Warcraft and other role-playing games, your character "levels up" after earning a set amount of experience, thus gaining power and receiving new abilities. As the game goes on, the required ratio grows larger (often exponentially), thus making the game more challenging and the reward of leveling up that much greater a reinforcement.

Variable Ratio- In Skinner's lab, a rat was rewarded with food on an average of 25% of its bar presses, meaning that the number of presses required before each reward varied. In many adventure games, such as Torchlight or Dungeons and Dragons, each enemy has a chance to drop a rare item every time you kill it. Any monster could grant a huge reward!

Fixed Interval- In Skinner's lab, a rat was rewarded with food for the first bar press after a fixed amount of time had passed. Many games offer achievements and rewards simply for playing a certain amount of time. You are reinforced for consistency.

Variable Interval- In Skinner's lab, a rat was rewarded for the first bar press after a varying amount of time had passed. This technique is occasionally used in online role-playing games to keep people logging in. For example, a monster holding epic treasure may appear at random. Each time you log in to the game could be your opportunity to slay the beast and claim the prize! (A toy sketch of how two of these schedules differ follows below.)
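
Here is a toy Python sketch of how two of these schedules differ in practice. The press counts, ratios, and random seed are arbitrary illustrations, not figures from Skinner's experiments.

    import random

    # Toy illustration of two reinforcement schedules (parameters are arbitrary):
    # a fixed ratio rewards every nth response, while a variable ratio rewards each
    # response with probability 1/n, so the gap between rewards is unpredictable.

    def fixed_ratio_rewards(presses, n=4):
        return sum(1 for press in range(1, presses + 1) if press % n == 0)

    def variable_ratio_rewards(presses, n=4, seed=7):
        rng = random.Random(seed)
        return sum(1 for _ in range(presses) if rng.random() < 1 / n)

    presses = 100
    print("Rewards earned over", presses, "bar presses:")
    print("  fixed ratio (every 4th press):     ", fixed_ratio_rewards(presses))
    print("  variable ratio (1-in-4 on average):", variable_ratio_rewards(presses))

Both schedules pay out about the same number of rewards, but the variable ratio keeps you guessing about which press will pay off, which is exactly why loot drops are so hard to walk away from.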

Ding!


Each of these schedules of reinforcement has its upsides and downsides, which is why the most addictive video games (such as World of Warcraft) attempt to use all of them in some form. With reinforcement bombarding them at every turn, gamers keep on gaming.

The same can't be said for most of our jobs. Reinforcements in the form of raises or benefits are often few and far between. Moreover, when they do arrive, it often seems as though they were granted for no discernible reason.

Employers take note: Learn from video games and B.F. Skinner. Attempt to adopt all or most of his schedules of reinforcement and you will create a substantially more fulfilling workplace.

Oh Me, Oh Methane: My Favorite Climate Change Horror Story

A number of possible cataclysmic outcomes of human-caused climate change have been suggested over the last few years. We've all been treated to previews of an ice age in Europe, abrupt rises in the sea level, and the end of agriculture as we know it. All of these scenarios could inflict massive death tolls, cause destruction beyond belief, and create millions of environmental refugees.  Despite their terrifying implications, none of the aforementioned predictions come close to my favorite: a global oceanic "burp."

I'm talking, of course, about methane - or, more specifically, about the potential for a global release of methane from the seafloor. Current estimates place anywhere from 500 to 10,000 gigatons of carbon stored as methane on the ocean floor. Compare that stored amount to the 2008 atmospheric carbon level of 810 gigatons, and you can see how a release of even a fraction of this underwater methane could have a tremendous impact on our planet. High methane levels are already hypothesized to have caused the Triassic-Jurassic extinction event, in which up to 50% of all marine species became extinct.
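
A quick scale check, using only the figures quoted above, shows why even a modest release would matter. The release fractions in this Python snippet are hypothetical, chosen purely to illustrate the arithmetic.

    # Scale check using the figures quoted above (gigatons of carbon).
    # The release fractions are hypothetical, chosen only to illustrate the scale.
    seafloor_low_gt, seafloor_high_gt = 500, 10_000
    atmospheric_carbon_2008_gt = 810

    for fraction in (0.05, 0.10, 0.25):
        low = fraction * seafloor_low_gt / atmospheric_carbon_2008_gt
        high = fraction * seafloor_high_gt / atmospheric_carbon_2008_gt
        print(f"Releasing {fraction:.0%} of the seafloor store adds "
              f"{low:.0%} to {high:.0%} of the 2008 atmospheric carbon stock")

Even a five percent release at the low-end estimate is a noticeable addition; at the high end of the estimate, it's comparable to more than half of everything already in the atmosphere.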

Move over, livestock. The ocean could release a lot more methane than you.
 

How do we know that a methane "burp" could happen in the future? A recent study by Micha Ruhl and colleagues, published July 21 in Science, has shown that it may have already happened in the past. As Brandon Keim reported for Wired:

Ruhl's team examined chemical traces left by dying plants on the shores of the Tethys Sea, a body of water that separated the ancient continents of Laurasia and Gondwana.
The researchers concentrated on changes to carbon isotopes, or subtly different elemental formations that betray whether carbon in plants came from carbon dioxide or methane. At 201.4 million years ago, in that narrow 20,000-year window of their updated end-Triassic cataclysm, they found a rise in CO2 followed by a tremendous spike of methane.

Ruhl believes that a release of carbon dioxide from volcanoes around 200 million years ago triggered a rise in global ocean and land temperatures, thus causing the release of methane from hydrates on the seafloor. Methane is a greenhouse gas roughly 23 times more potent than carbon dioxide, and its release would create a worldwide positive feedback loop: as methane is released, temperatures rise faster, causing methane to be released more rapidly, causing temperatures to rise further, and so on. Presumably, this would continue until palm trees (indicative of a tropical climate) extended to the North Pole, much as they did during the Eocene epoch.
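
To see how such a loop runs away, here is a toy Python sketch. Every number in it is purely illustrative and bears no relation to real climate sensitivities or hydrate inventories; it only shows the shape of a self-amplifying process.

    # Toy positive feedback loop (all parameters purely illustrative, not climate data):
    # warming releases methane, and the added methane forces further warming.

    temperature_anomaly_c = 1.0      # hypothetical initial warming from volcanic CO2
    release_per_degree = 10.0        # methane released per degree of warming (made up)
    warming_per_unit_methane = 0.02  # extra warming per unit of methane (made up)

    for step in range(1, 9):
        released = release_per_degree * temperature_anomaly_c
        temperature_anomaly_c += warming_per_unit_methane * released
        print(f"step {step}: released {released:5.1f} units, "
              f"anomaly now {temperature_anomaly_c:.2f} C")

    # Each pass releases more than the last; the loop amplifies itself until something
    # else (say, the hydrate reservoir running dry) finally brings it to a halt.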

If this climate change "horror story" were to occur, the Earth would most certainly adapt, as it has done in the past. The question posed to all humankind is: Will we adapt? Well, after we've dealt with the anticipated massive rise in sea level, the billions of refugees, and the guilt of knowing that we caused the extinction of millions of species, maybe we can all take a trip to the tropical waters of the Arctic? I'm sure it would be quite picturesque.

The Navy Wants You... To Be Energy Efficient

Despite the charged American political climate that now envelops the issue of global climate change, the United States Navy is boldly moving forward with plans to reduce its massive consumption of energy and foreign oil through a project called Task Force Climate Change. The service branch views energy independence and climate change as matters of national security. Imagine that!

Because the Navy doesn't have the luxury of professing a political opinion, it is required to make rational decisions based on a plethora of objective information. Thus, through Task Force Climate Change, a branch of the Navy's Energy and Environmental Readiness Division, the Navy has made the following conclusions about global climate change after considering "a broad consensus of observational evidence and historical trends":

  • The Arctic is losing sea ice.
  • Large land-fast ice sheets (particularly in Greenland and Antarctica) are steadily losing ice mass.
  • Global air and sea temperatures are warming.
  • Sea level is rising in certain areas.
  • Precipitation patterns are changing.

In response to Navy scientists' carefully considered conclusions, and out of a desire to promote the cause of energy security, Rear Admiral Phil Cullom has likened the current situation to our original war for independence.

"Today, we are fighting for our independence on a different front. That of foreign oil," he said in a video message.
 
The Navy also recognizes that it is currently fighting against itself in this new war for energy independence - the Pentagon spent $20 billion on energy costs last year alone.

That is why the Navy is heavily investing in alternative fuel research, studying ways to more effectively produce, use, and store energy, and encouraging its suppliers to become more energy efficient through a Preferred Supplier Program. It has also set an ambitious goal of cutting petroleum use in its commercial vehicle fleet by 50% by 2015.

These are laudable efforts, but I say, "Let's think bigger." For instance, I would love to see seawater-fueled fusion reactors propelling our naval vessels of the future. It's a seemingly perfect fit.

My musings aside, the Navy's encouraging stance on energy conservation is a praise-worthy call to action. In the aforementioned video message, Rear Admiral Phil Cullom sought to recruit all Americans to join in the Navy's effort:

"Clearly, the scientists and engineers are working hard, but their efforts alone will not be enough to win our nation's latest battle for independence. We must all be in the fight. We must all consciously make good choices about our energy consumption and about our approach to the environment."

As a start, I intend to support our troops by driving less and turning off the lights.

This article is in response to a Time Magazine piece posted to Real Clear Science on July 19, 2011.

Follow Up: Plastic Bioaccumulation in Ocean

On June 20th, the Newton Blog raised the possibility that widespread oceanic plastic pollution could lead to a bioaccumulation of plastic up the marine food chain, potentially even ending up in our food some day. A new study from the Scripps Institution of Oceanography adds evidence to this disturbing notion.

Two graduate students with the Scripps Environmental Accumulation of Plastic Expedition, or SEAPLEX, found evidence of plastic waste in more than nine percent of the stomachs of fish collected during their voyage to the North Pacific Subtropical Gyre. Based on their evidence, authors Peter Davison and Rebecca Asch estimate that fish in the intermediate ocean depths of the North Pacific ingest plastic at a rate of roughly 12,000 to 24,000 tons per year.

The twenty-day SEAPLEX survey of the North Pacific Subtropical Gyre took specimens from varying depths and locations throughout the gyre and examined 141 fishes across 27 different species. The researchers found that plastic was present in the stomach contents of 9.2% of the fish dissected. The authors believe that this number may be low.


"That is an underestimate of the true ingestion rate because a fish may regurgitate or pass a plastic item, or even die from eating it. We didn't measure those rates, so our nine percent figure is too low by an unknown amount," said Davison.


The study also discovered that most of the ingested plastic was broken down to pieces smaller "than a human fingernail." This supports Newton Blog's previous suggestion that plastic photodegradation could lead to ingestion by marine animals at the lower end of the food chain.

The vast majority of fish that were found to have ingested plastic particles were myctophids, more commonly known as lanternfish. Myctophids are generally small fish and serve as a major source of food for many larger marine animals, including salmon and tuna - fish that humans consume.

"[Myctophids] have an important role in the food chain because they connect plankton at the base of the food chain with higher levels. We have estimated the incid ence at which plastic is entering the food chain and I think there are potential impacts, but what those impacts are will take more research," said Asch.

The jury is still out on whether oceanic plastic bioaccumulation will one day pose significant health dangers to humans, but the Scripps study certainly reinforces the claim that plastic is entering the marine food chain at an alarming rate.

Sad Music Can Make You Happy

With my music collection being what it is (dominated by rock and metal, with a smattering of less mainstream rap and classical), I've often fielded this question from friends: "Ross, your music is depressing; doesn't it make you sad?"
To which I've always replied, "No, I actually listen to it because it makes me happy."

"Weird."

I thought it was strange, too. How could it be that sad music would actually make somebody less sad?

Well, thanks to David Huron, a distinguished professor of arts and humanities in the School of Music and the Center for Cognitive Science at the Ohio State University, I am now armed with an explanation.

The answer involves a hormone called prolactin, which is most commonly associated with lactation in women. However, in an interview with The National, Huron insists that prolactin also limits feelings of depression that result from "sad" experiences.

When a person is in a sad state, this hormone called prolactin is released and it has a consoling psychological effect. So if something happens like your pet dog dies, or you've lost your job, or you've broken up with your boyfriend or girlfriend, and you feel this sad, grief state, the body will respond by releasing prolactin. In most cases, this produces a consoling or warming effect. It's as though Mother Nature has stepped in and said "We don't want the grief to get too exorbitant."

Huron believes that sad music can put a listener into a "sham state" of sadness. This causes prolactin to be released without the listener actually experiencing a sorrowful event. Thus, without real grief to counterbalance the hormone's effect, the listener receives a net positive "comforting" effect.

However, Huron has found that not everyone can feel this effect (my naysaying friends could be part of the disadvantaged group that cannot). One of the professor's previous studies discovered that individuals who score high on "openness" or "neuroticism" in personality tests are more inclined to listen to sad music, perhaps because they can feel the prolactin effect.

Now that I can explain to my friends why sad music makes me happy, I just need to convince them that I'm merely more "open," not neurotic. This point might be a harder sell...

Baboons & Humans: Being Beta is Not So Bad

A new study from Princeton research associate Laurence R. Gesquiere and his colleagues has shown that being an alpha male may not be quite as glamorous as it seems (if you're a baboon, at least). Studying five troops of wild baboons in Kenya over nine years, Gesquiere and his fellow researchers discovered that alpha baboons maintain very high stress levels - much higher, in fact, than those of the beta males.

The stress, they suggested, was probably because of the demands of fighting off challengers and guarding access to fertile females. Beta males, who fought less and had considerably less mate guarding to do, had much lower stress levels. They had fewer mating opportunities than the alphas, but they did get some mating in, more than any lower-ranking males.

Now, some might say that comparing baboons to humans in matters of stress and social structure would be folly... but I'm going to compare them anyways. (If you're averse to this idea, think about it this way: since 91% of human and baboon DNA is similar, feel free to assume that 91% of my comparisons will be legitimate.)

In baboon society, the alpha male sits at the top of the troop, with mating rights over all the females. What a great existence! Unfortunately, life isn't as sweet as it sounds, as the alpha must regularly fight to defend his mating rights and dominance. The battering takes its toll both mentally and physically. Not only is the alpha's body consistently tattered and torn, he also deals with chronically high stress levels, which pose a significant health risk.

Long-term stress levels are a different matter. "In the long term, you fall apart, or are subject to diseases," said Jeanne Altmann, an emeritus professor of ecology and evolutionary biology at Princeton, and senior author of the new report.

Meanwhile, the beta males fight only occasionally and don't have to guard as many mates. This leads to lower stress and a potentially longer, healthier life.

Can this scenario possibly ring true for humans, as well? Human "alphas" run corporations, flaunt their power and influence, and get all the girls. How can they be more stressed than the "betas" below them? Easily.

Think about it; while an alpha male might portray dominance and be sought-after by females, he's probably also targeted for toppling by other alpha males and even some betas. As you may have heard, power often breeds enemies.

An alpha human male may also be more likely to fall victim to crime or injury. For example, a robber or deviant (think a lower-ranking male baboon) might be more likely to target a Lexus than a Toyota. In addition, the alpha male may have to repeatedly defend his dominance through business competition, sports, or even fighting.

While an alpha male might be calling the police to report that his fancy car was stolen, repairing a tarnished reputation, or even receiving medical attention after a fight, a beta male is probably living with less stress. His eight-year-old car goes unnoticed by thieves, and he may not subject himself to the rigors of competition at all.

We humans are often focused on climbing ever higher on the social ladder, believing that reaching the next rung will improve our lives. But with Gesquiere's baboon study in mind, I invite you to consider that being beta might just be better.

Block Quotes: Gorman, James. "Baboon Study Shows Benefits for Nice Guys, Who Finish 2nd." The New York Times. 15 July 2011: A1

Wireless Power is the Future

All around us we now have wireless phones, wireless internet, wireless charging, wireless speakers, wireless remotes, wireless controllers, and more. The desire is apparent: we refuse to be tethered to our appliances. But we are not free yet... there is still a final obstacle that must be overcome.

Electrical cords: like snakes they slither over desks, around table legs, under workstations, and behind televisions. Their serpentine omnipresence is enough to drive Indiana Jones to the height of fear, and sufficient to take an ordinary consumer to the pinnacle of annoyance. Cords entwine our legs, tax our organizing skills, and get chewed by pets, only to be purchased again (at a massive mark-up price) out of sheer necessity. I say, "No more!"

Thankfully, a team of MIT scientists led by Marin Soljačić is well on the way to curing our woeful need to plug in our electrical devices. In 2006, they successfully transferred power wirelessly from a "resonant source" to a "resonant capture device" attached to a 60-watt light bulb over seven feet away. The process, which they call "magnetic resonance coupling," sounds straight out of Star Trek. In 2007, they took this technology and formed a new company, WiTricity. The MIT team has been working to rid the world of wires ever since.
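
The core idea is that the source coil and the capture coil are tuned to the same resonant frequency, so energy transfers efficiently between them while barely coupling to anything that isn't tuned. Here is a minimal Python sketch of that tuning condition using the textbook LC resonance formula; the component values are arbitrary illustrations, not WiTricity's actual design.

    import math

    # Sketch of the tuning condition behind resonant coupling: energy moves efficiently
    # between two coils only when their LC resonant frequencies match.
    # Component values are arbitrary illustrations, not WiTricity's specifications.

    def resonant_frequency_hz(inductance_h, capacitance_f):
        """Ideal LC resonant frequency: f = 1 / (2 * pi * sqrt(L * C))."""
        return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

    source_hz = resonant_frequency_hz(inductance_h=1.0e-6, capacitance_f=250e-12)
    capture_hz = resonant_frequency_hz(inductance_h=1.0e-6, capacitance_f=250e-12)

    print(f"source coil:  {source_hz / 1e6:.2f} MHz")
    print(f"capture coil: {capture_hz / 1e6:.2f} MHz")
    print("tuned to the same resonance:", math.isclose(source_hz, capture_hz, rel_tol=1e-3))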


According to WiTricity, their wireless power transmission technology holds numerous benefits:

  • It has been shown to be up to 95% efficient for many applications, depending upon the distance.
  • Electricity can be transferred through common building materials, humans, and even around metals that might block the magnetic fields
  • It is completely safe to animals and humans
  • It can be scalable from milliwatts to kilowatts
  • It eliminates the need for all those darned power cords!
  • The power sources only transfer electricity when a device or appliance calls for it. No wasted energy!
  • The power sources can be placed in almost any OEM application

Think about it. In the future, there might be only one power source embedded in a structure; no outlets and no cords. You could place electrical devices wherever you want. Your computer workstation and your entertainment center would no longer be a certifiable fire hazard. Your cellphone would never die or annoy you with cries of "low battery" like a chirping bird squawking for more worms.

It is perhaps because of this utopian vision that companies are clamoring to partner with WiTricity. This year alone, the company has announced partnerships with six other firms, including Toyota. In addition, on June 23rd, WiTricity was one of ten companies to receive a General Electric "Ecomagination" Award, accompanied by an undisclosed investment prize from a $63 million pot. The future for wireless power looks bright, indeed.

Tomorrow I shall venture to Best Buy to replace yet another pet-mutilated iPod charging cable. I invite you to join me in declaring, "Never again!"  Here's to a wirelessly-powered future.

Male Soccer Players More Likely to Fake Injury

If you were lucky enough to watch the United States pull off an amazing comeback against Brazil in the FIFA Women's World Cup on Sunday, you also had a chance to witness a medical miracle. In the 115th minute, Brazilian defender Erika Christiano Dos Santos collapsed in a heap in front of the Brazilian goal, apparently seriously injured. With the United States down 2-1, play was stopped for 90 crucial seconds as the clock ticked down toward what would have been the USA's first quarterfinal loss in the history of the Women's World Cup. In that time, Erika was attended to by FIFA medical staff and was eventually carried off the field on a stretcher.

Yet, as soon as Erika was taken off the field, an apparent miracle occurred. She sat up, unhitched herself from the stretcher, and jogged back to her sideline, apparently completely healed. However, when the boos and whistles began to rain down upon her in torrents, a limp suddenly re-entered her stride.

As you are undoubtedly aware, Erika's "injury" was likely not an injury at all, but a peculiar facet of the game of soccer called "simulating" - an act more popularly known as "flopping." This technique, whereby a player fakes an injury to deliberately waste time or to draw the attention of the referee, has long been a fixture of the men's game, and it has recently become more prominent in the women's game, with Erika's "resurrection" serving as a prime example.

Researchers at Wake Forest University recently took it upon themselves to study injury simulation in the women's game. After examining broadcast recordings of 47 matches from two different tournaments, they found that of the 5.74 injuries that occurred per game, 0.78 were "definite" (the player withdrew from the game within five minutes or visible bleeding was observed) and 4.96 were "questionable" (the remaining incidents). That amount of questionable injuries seems high, but it is still significantly less than in the men's game, where definite injuries made up only 7.2 percent of all injuries, versus 13.7 percent for the women.
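
The women's percentage follows directly from the per-game figures; here is the quick arithmetic in Python for anyone who wants to check it.

    # Checking the women's figures quoted above.
    definite_per_game = 0.78
    questionable_per_game = 4.96
    total_per_game = definite_per_game + questionable_per_game  # 5.74, as reported

    definite_share = definite_per_game / total_per_game
    print(f"definite injuries: {definite_share:.1%} of all women's injuries")
    # Prints roughly 13.6%, in line with the ~13.7% figure cited for the women,
    # versus the 7.2% reported for the men.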

While the Academy Awards could have discovered numerous nominees for "best female performance" in the epic USA-Brazil quarterfinal match on Sunday, the men still have the overall edge in the acting category, and a huge deficit in the sportsmanship category.

Criticism of 'Questionable' NSF Projects Misguided

In April, Senator Tom Coburn released a report examining wasteful spending at the National Science Foundation. His report attempted to highlight incidents of "mismanagement of taxpayer funds," acts of "cheating taxpayers out of scientific funds," and wasteful "duplication" of programs between different sections of government. Most of the report, however, was dedicated to showcasing forty-eight "questionable NSF projects" (projects that received National Science Foundation funding).

Some of the "questionable" projects:

  • "Impaired Metabolism and Performance in Crustaceans Exposed to Bacteria." A study looking at how long shrimp can run on a treadmill.
  • "Ticket to Ride: When to buy or not to buy." A study exploring when to buy a ticket to a sold-out sporting event to receive the best deal.
  • "SETI." The Search for Extra-Terrestrial Intelligence.
  • Numerous projects studying personal interactions in virtual worlds and online video games.
  • Various social studies about politics.

Admittedly, some of the projects do seem wasteful, particularly the ones whose findings would only seem to benefit politicians, such as a project that asked "how successful are party leaders at mobilizing support for party programs?"

Despite this, we should be hesitant to denounce specific studies. After all, plenty of seemingly "questionable" projects have accidentally led to some of the most important scientific discoveries of all time.


In 1791 Luigi Galvani was an anatomist at the University of Bologna. Galvani was investigating the nerves in frog legs, and had threaded some legs on copper wire hanging from a balcony in such a way that a puff of wind caused the legs to touch the iron railing. A spark snapped and the legs jerked violently (even today, we speak of being "galvanized" into action). In one unintended step, Galvani had observed a closed electrical circuit, and related electricity to nerve impulses.

In 1879, Louis Pasteur inoculated some chickens with cholera bacteria. It was supposed to kill them, but Pasteur or one of his assistants had accidentally used a culture from an old jar and the chickens merely got sick and recovered. Later, Pasteur inoculated them again with a fresh culture that he knew to be virulent, and the chickens didn't even get sick. Chance had led him to discover the principle of vaccination for disease prevention.


Granted, these discoveries were made without the help of taxpayer dollars, but the point is that scientific studies don't have to appear relevant to be truly "groundbreaking." One simply cannot know which experiment will lead to a major discovery. The purpose of Science is to ask questions. And sometimes we ask them without knowing what kind of answer will be found, or even if an answer will be found.  

While it is important that the National Science Foundation not disburse funding to overtly frivolous projects (such as those that clearly benefit only special interests), we should not be so quick to label studies as wasteful just because their purpose is not clear. In the 1700s, many may have thought that Luigi Galvani was wasting his time cutting up frog legs, but thanks to him we now understand that electricity is related to nerve impulses. In the 1800s, some may have held the opinion that Louis Pasteur was wasting perfectly edible chickens, but thanks to him we now have the modern vaccination procedure. Just because a scientific study doesn't make sense to one politician doesn't mean that it lacks a deeper, unforeseen relevance.
 
Who knows, maybe shrimps on treadmills could end up changing our lives forever?

Run, Shrimpy, run!


Block quotes from: Gedney, Larry. "Unexpected Scientific Studies Are Often the Most Important." Alaska Science Forum. University of Alaska-Fairbanks. 4 Nov 1985.

End of Shuttle Program Only a Temporary Setback

While space shuttle launches are often momentous occasions, my viewing of last Friday's liftoff of Atlantis was a somewhat hollow experience because it heralded the end of an era of American spaceflight. Over forty years have passed since America won the race to space, and now it seems as though we are ceding that victory.

NASA is now focused on nurturing private companies as they construct their own space vehicles. It's strange to see the agency that landed a man on the Moon take such a backstage role, but budgets are tight, and the current attention of Congress doesn't seem to extend past partisan bickering.

Polls show that Americans are sad to see the end of the shuttle program. Despite the program's estimated $200 billion cost since 1981, over 63% of Americans, according to a CBS News survey, say that the space shuttle program was worthwhile. In addition, 48% of those polled were "disappointed" by the end of the program, versus 16% who were "pleased" (33% did not care).

For those of us who are "disappointed," our reasons for that disappointment vary. To me, it seems that $200 billion since 1981 is a small price to pay for a program that dared us to dream and united us all. (Especially since we have spent over $1 trillion on war since 2001.) 

The space shuttle program wasn't perfect. There were mistakes and tragedies along the way. But there's nothing easy or routine about breaking the bond of gravity and venturing boldly into the great beyond.

Thanks to the shuttle program, the private entities taking up NASA's charge will begin with a plethora of lessons learned. Now it's their time to shine - and they'd better... they have big shoes to fill.

It is my hope that one day, as we gaze in amazement as an American steps onto the red sands of Mars for the first time, we will look back on Friday's final liftoff of Atlantis as only a minor setback.

Blind Doctors Have Much to Offer

Tim Cordes graduated as valedictorian of his Notre Dame class in 1998 with a degree in biochemistry. He knew he wanted to go to medical school, yet the road to his dream was far from guaranteed. Despite his enviable resumé, Cordes was rejected by eight different schools solely or partly because he was blind. Thankfully, not all of them were daunted by his condition, and the University of Wisconsin-Madison gave him a chance. That chance was all Cordes needed.

In order to accommodate Cordes, the university provided the help of "visual describers" and a machine that converted visual images into raised lines. Though these accommodations certainly helped, it was Cordes' hard work and dedication that truly propelled him. His devotion allowed him to place a tube in a patient's windpipe correctly on the very first try, a difficult feat for any student to accomplish.

Today, Tim Cordes holds an MD and a PhD from the University of Wisconsin and works as a psychiatric physician at UW Hospital. He and his guide dog, Bella, are easy to spot. The positive effects that they have on both patients and colleagues are just as evident.

Cordes' colleagues offer glowing reviews of his performance. Dean Krahn, chief of psychiatry at the William S. Middleton Memorial Veterans Hospital, says that Cordes has discovered potentially life-threatening blood clots that others had missed, using only his sense of touch. One of Cordes' interns says that Cordes can realize that a medication is creating side effects based solely on the sound of a patient's voice. All of his colleagues concur that despite his blindness, Cordes can truly "see" his patients. In a way, his condition has allowed him to connect to patients in a way that his able-bodied colleagues cannot.

Cordes' impressive abilities likely developed because he was forced to compensate for his lack of vision with his other senses. Though studies have shown that blindness doesn't actually sharpen the remaining senses, blind people do learn to be more attentive to touch, taste, smell, and sound. Perhaps if we had more blind doctors, they could teach their sighted colleagues to be more attentive as well, which might produce more accurate diagnoses in certain situations.

Experience has taught us that enhancing diversity only makes us stronger. The example of Tim Cordes suggests that those with impaired senses have much to offer the fields of medicine and science.


Sources: Smith, Susan. "Seeing Potential." On Wisconsin. Summer 2011: 37-39, 62