The innovation-loving Information Technology (IT) sector is a natural candidate to be front-and-center on sustainability, and some high-profile decisions have certainly bolstered that image. Whether it’s HP’s aggressive emissions reduction targets or Yahoo’s sophisticated cooling technologies in its data centers, there’s no shortage of stories on the green innovations of IT powerhouses. Giants like Intel and Cisco are some of the largest purchasers of renewable energy in the country.
But with data consumption rising more than 10% every year and with most companies still hitched to dirty energy sources like coal, a full-fledged green transition in the IT sector is urgently needed.
Consider these statistics: electronic devices account for roughly 15% of a US home’s energy consumption, a figure Greenpeace expects to triple by 2030. Much of our digital carbon footprint comes from manufacturing the many devices we frequently buy and replace, and from the energy it takes to power them.
The US is also home to 40% of the world’s data centers. While data centers are typically far more efficient than on-site data storage (switching to the “cloud” can reduce most companies’ energy consumption), their energy needs may triple or quadruple over the next 10 years. Apple’s new iDataCenter in North Carolina, for example, requires as much energy as 80,000 average US homes, and the grid in that region is heavily reliant on coal power plants.
So, how can the IT sector become more sustainable, and even use its expertise to encourage sustainability in other fields?
Since manufacturing makes up a large portion of the computing footprint, using fewer and less environmentally damaging materials in device construction, while bringing renewable energy into the supply chain, can make a big impact. Users can also buy more efficient products and practice good power management.
Training a new generation of IT professionals is one way to embed sustainability in the field. While organizations from colleges to corporations are implementing sustainable practices in their IT departments, there are few universities offering degrees or certification on the topic. An exception is Leeds Metropolitan University, one of the first schools in the world to offer a degree program in green computing.
The power sector produces the most carbon emissions of any industry, and it’s here that the IT sector could really put its know-how to work. Much energy is wasted simply because utilities lack good tools to anticipate demand. IT, in the form of a smart grid, could tighten up our energy system and reduce that unpredictability and waste.
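The basic idea of demand forecasting can be shown with a toy sketch. Everything here is hypothetical for illustration, including the numbers, the moving-average method, and the assumption that a utility without forecasting provisions for a padded worst-case peak; real grid operations are far more sophisticated:

```python
# Illustrative sketch: a simple moving-average demand forecast, standing in
# for the kind of prediction tools a smart grid would bring to utilities.

def moving_average_forecast(demand_history, window=3):
    """Predict next period's demand as the mean of the last `window` readings."""
    recent = demand_history[-window:]
    return sum(recent) / len(recent)

# Hypothetical hourly demand readings, in megawatts.
hourly_demand_mw = [82, 85, 90, 95, 93, 88]

forecast = moving_average_forecast(hourly_demand_mw)
print(f"Forecast for next hour: {forecast:.1f} MW")  # (95 + 93 + 88) / 3 = 92.0

# A utility with no forecast might generate for a padded worst-case peak;
# the gap between that provisioning and the forecast is potential waste.
peak_provisioning_mw = max(hourly_demand_mw) * 1.1
print(f"Over-provisioning avoided: {peak_provisioning_mw - forecast:.1f} MW")
```

Even this crude method narrows the gap between what is generated and what is actually needed; the point of a smart grid is to do this continuously, with far richer data.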
Digital technologies have changed our lives beyond what anyone might have imagined mere decades ago, and their rise has coincided with growing pressure on natural resources. Ensuring that IT is a solution to, and not a source of, our environmental problems is a task individuals and organizations at all levels must take on.
Learn more about what EarthShare member charities are doing to help green IT:
What is the Smart Grid? (Scientific American)