Analytics and Climate Change: A Critical Review of “Data to Decisions”


In his article “Data to Decisions: How Technology Can Solve a $1.2 Trillion Climate Change Problem” (World Economic Forum, February 2, 2024), Himanshu Gupta argues that analytics and climate data present a significant opportunity for businesses to manage risk and create value in a warming world. Citing a CDP Global Supply Chain Report, he warns that suppliers could face around $1.26 trillion in potential revenue losses over five years. He suggests that data tools, including remote sensing, IoT sensors, AI, and machine-learning climate models, can transform uncertainty into actionable decisions. By framing climate analytics as a strategic advantage rather than just an environmental duty, Gupta effectively grabs the attention of businesses.

The article's main strength is how it connects emerging technologies with real business uses. Gupta shows how companies can use analytics for supply chain planning, asset risk assessment, and production choices. For example, he points to an agribusiness that uses AI-based weather simulations to shift planting windows and a construction materials company that positions production facilities in anticipation of hurricanes. These examples make “climate data” concrete, bridging the gap between environmental science and business strategy.
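The article does not show what such an analysis looks like in practice, but a minimal sketch of the planting-window idea might look like the following. The rainfall generator here is a stand-in for a real AI weather simulation, and all thresholds are hypothetical:

```python
import random

def simulate_daily_rainfall_mm(n_days: int, seed: int = 0) -> list[float]:
    """Stand-in for an AI weather simulation: random daily rainfall (mm).
    A real system would use ensemble forecasts, not random draws."""
    rng = random.Random(seed)
    return [max(0.0, rng.gauss(4.0, 6.0)) for _ in range(n_days)]

def score_planting_windows(rainfall: list[float], window: int = 14,
                           min_total: float = 30.0,
                           max_total: float = 120.0) -> list[tuple[int, float]]:
    """Score each candidate start day: a window counts only if its cumulative
    rainfall falls inside a crop-friendly band (hypothetical thresholds)."""
    scores = []
    for start in range(len(rainfall) - window + 1):
        total = sum(rainfall[start:start + window])
        scores.append((start, total if min_total <= total <= max_total else 0.0))
    return scores

rain = simulate_daily_rainfall_mm(60)
best = max(score_planting_windows(rain), key=lambda s: s[1])
print(f"Best 14-day planting window starts on day {best[0]} ({best[1]:.0f} mm)")
```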
However, the article has several weaknesses. Most importantly, the claim that climate data could unlock $1.2 trillion lacks support. Gupta offers no clear method or assumptions to back up this figure: no breakdown by sector, time frame, or geography. Without that context, the number seems more rhetorical than analytical. A convincing economic argument should at least briefly explain how that value was estimated or reference supporting data.
Another issue is Gupta’s optimism about technology. The article suggests that digital tools alone can drive change but pays little attention to real-world challenges, such as high implementation costs, poor data governance, limited interoperability, and a lack of analytical skills. Many small businesses and public institutions cannot afford complex models or sensor networks. Overlooking these challenges risks oversimplifying the issue. A more balanced discussion would recognize the need for policy incentives, partnerships, and skill-building to make analytics available beyond large corporations.
Gupta also downplays uncertainty in climate models. Climate projections rely on assumptions and data that often struggle to account for unprecedented or extreme events. Treating analytic results as precise forecasts can lead to misplaced confidence. The article would have been more credible had it emphasized robust decision frameworks, ones that explore multiple scenarios and prepare for uncertainty, rather than suggesting data alone can “solve” climate risk.
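To make that point concrete, here is a minimal sketch of scenario-aware decision analysis; the scenarios, probabilities, and loss figures are all invented for illustration:

```python
# Compare two adaptation decisions across several climate scenarios instead of
# trusting a single "most likely" forecast. All numbers are illustrative.
scenarios = {"mild": 0.7, "moderate": 0.2, "severe": 0.1}  # scenario -> probability
losses = {  # decision -> assumed loss ($M) under each scenario
    "do_nothing":    {"mild": 0,  "moderate": 10, "severe": 100},
    "harden_assets": {"mild": 12, "moderate": 14, "severe": 18},
}

for decision, by_scenario in losses.items():
    expected = sum(p * by_scenario[s] for s, p in scenarios.items())
    worst = max(by_scenario.values())
    print(f"{decision}: expected loss ${expected:.1f}M, worst case ${worst}M")
```

In this toy comparison, doing nothing looks cheaper on expected value ($12.0M versus $13.0M), yet hardening assets caps the worst case at $18M instead of $100M. A single-forecast analysis would hide exactly that trade-off.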
Equity is another missing dimension. The companies best equipped to leverage climate analytics are usually wealthy multinationals. Smaller businesses, suppliers in developing regions, and vulnerable communities often lack access to these tools. Without fair data sharing and capacity building, the advantages of climate analytics may widen existing inequalities. Gupta's vision would be more convincing if it discussed how to democratize data access or help developing economies adopt these technologies.
Finally, while Gupta advocates for collaboration, he offers little guidance on the institutional or policy structures required to expand these efforts. Questions remain about how to incentivize firms to share data, what standards should regulate its use, and how regulators can guarantee data quality and fairness. Without addressing these issues, the article feels more aspirational than practical.
In conclusion, Gupta’s article makes a strong case for using analytics to address climate challenges. It effectively repositions climate data as both a risk management tool and a growth opportunity. However, the argument would be stronger with clearer methodology, a focus on barriers to adoption, recognition of model uncertainty, consideration of equity, and specific policy suggestions. Without these elements, the promised $1.2 trillion in value remains an inspiring vision but not a fully credible roadmap for change.


Source: Gupta, Himanshu. “Data to Decisions: How Technology Can Solve a $1.2 Trillion Climate Change Problem.” World Economic Forum, 2 Feb 2024. https://www.weforum.org/stories/2024/02/data-decisions-technology-climate-change-problem/






Robots Beneath the Waves: Monitoring the Ocean’s Changing Carbon Cycle

Deep beneath the ocean’s surface, hundreds of free-floating robots are quietly measuring how the Earth breathes and how marine heatwaves are altering that rhythm. The Global Ocean Biogeochemistry Array (GO-BGC), led by the Monterey Bay Aquarium Research Institute (MBARI), has deployed a global fleet of autonomous robots to better understand the ocean’s changing environment.

Traditional methods of studying the ocean’s biogeochemistry, using satellites, buoys, and ships, are limited in depth and coverage. These new robotic floats fill that gap by continuously monitoring the ocean’s biological, chemical, and physical properties. Each float drifts at about 1,000 meters below the surface for nine days, then descends to 2,000 meters before rising to the surface to relay data via satellite. The cycle then repeats, collecting vital information on oxygen, pH, nitrate, suspended particles, chlorophyll, temperature, conductivity, and depth.
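One way to picture that routine is as a repeating sequence of phases. The sketch below is purely illustrative, not MBARI’s actual control software; the depths and durations come from the description above, and the phase timings are rough assumptions:

```python
from dataclasses import dataclass

@dataclass
class Phase:
    name: str
    depth_m: int       # target depth during this phase
    duration_h: float  # approximate hours spent in the phase (assumed)

# One ~10-day BGC-Argo cycle: drift at 1,000 m for ~9 days, descend to
# 2,000 m, then profile up to the surface and transmit via satellite.
CYCLE = [
    Phase("descend to drift depth", 1000, 6),
    Phase("drift and sample", 1000, 9 * 24),
    Phase("descend to profile depth", 2000, 6),
    Phase("ascend while profiling sensors", 0, 8),  # O2, pH, nitrate, chlorophyll, ...
    Phase("surface satellite transmission", 0, 1),
]

def cycle_length_days() -> float:
    """Total elapsed days for one complete cycle."""
    return sum(p.duration_h for p in CYCLE) / 24

print(f"One cycle takes about {cycle_length_days():.1f} days")  # ~9.9 days
```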

[Figure: 10-day BGC-Argo robot cycle for collecting and transmitting data]

As MBARI senior scientist Ken Johnson explains, “Marine heatwaves cause changes in ecosystem structure—in the plankton and how they operate—and these shifts in carbon export and how the ocean sequesters carbon are changing the services the ocean provides to us.” This finding raises important questions at the intersection of technology, climate change, and the ocean’s health.

COP30 Kicks Off: How Climate Technology Can Make a Difference

 


As COP30 begins in Belém, Brazil, the IO+ article “How Climate Technology Can Make a Difference” offers a hopeful yet realistic view of the role innovation can play in addressing the climate crisis. The authors spotlight emerging technologies from the Netherlands, such as Paebbl's CO2 mineralization, Carbyon's direct-air-capture machines, and SeaO2's seawater carbon removal, as examples of how science and technology can advance the Paris Agreement goals. While I agree that such innovation is positive, I think the article leaves out a key issue: technology cannot succeed without human, and more specifically political, agreement.

"Climate resilience technology: An inflection point for new investment"

McKinsey & Company analyzes the climate resilience technology market and projects it will grow to between $600 billion and $1 trillion by 2030, reflecting rising demand for technology that can counter the effects of climate disasters. Recognizing the importance of this market in the face of a changing climate, the authors outline the need for significant action and a variety of investors. With this understanding, McKinsey built a framework to help investors consider opportunities in the field, centered on the need for adaptation in the climate resilience technology market. The framework identifies ten technology categories as attractive investment opportunities, then creates subsets within these categories to refine the criteria and estimate the market size of each subset. This article ties directly to the theme of my Climate Technology course, where we use geospatial data and weather APIs to analyze real-world climate events.
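McKinsey does not publish the mechanics of the framework, but the category-then-subset structure it describes can be imagined as a simple scoring and sizing pass. Every category name, score, and dollar figure below is hypothetical:

```python
# Hypothetical illustration of a category -> subset -> market-size framework.
technologies = [
    # (category, technology, attractiveness score 0-1, est. 2030 market $B)
    ("resilient buildings", "flood-proof materials", 0.9, 120),
    ("resilient buildings", "cooling retrofits", 0.8, 90),
    ("water security", "smart irrigation", 0.7, 40),
    ("wildfire management", "early-detection sensors", 0.4, 10),
]

PRIORITY_THRESHOLD = 0.6  # invented cutoff for a "priority" technology

priorities = [t for t in technologies if t[2] >= PRIORITY_THRESHOLD]
by_category: dict[str, float] = {}
for category, _, _, size in priorities:
    by_category[category] = by_category.get(category, 0) + size

for category, size in sorted(by_category.items(), key=lambda kv: -kv[1]):
    print(f"{category}: ~${size:.0f}B estimated 2030 market")
```

Notice how the cutoff itself decides what counts as a priority, which is exactly the question raised below about McKinsey's criteria.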

The key strength of the article is its identification of analytics and technology as the focus of building climate resilience. In developing the framework, the authors acknowledge that their sole focus on private capital is “... only a part of what’s needed for full adaptation …” (McKinsey 2025). While the projection of up to $1 trillion in market growth highlights the financial importance of climate resilience technology, it also reveals that the authors are treating resilience technology as an investment market. The sole focus on private capital risks sidelining public and community-led adaptation efforts. Climate resilience requires contributions from everyone, not just the market. Building climate resilience is a shared challenge that depends on public data and collective decision-making. For instance, Maryland is developing a project off its coast to use wind turbine technology to promote sustainable energy solutions for growing energy demand. A project of this scale is not driven by profit incentive or private capital; it relies on the federal and state governments, public data sharing, and community advocates. The core of this project lies in the collective contribution of everyone, demonstrating that climate resilience cannot simply emerge from market forces. McKinsey's exclusive emphasis on private capital investors skips over the other major roles involved in fully adopting climate resilience technology.

The authors’ analysis screens more than 200 adaptation technologies, narrowing them down to 49 priority technologies, as illustrated in the figure. The chart shows that the largest investment opportunities are in resilient buildings, while the lower categories are projected to receive less funding. This narrowing raises questions about what criteria were used to define a priority. If the criteria were based primarily on financial return, the process would screen out important community-based or nature-based solutions. That could introduce a subtle bias: ignoring community-level adaptation and treating resilience as just an investment portfolio. A more balanced set of criteria could ensure the framework serves communities as well as the market. Although the McKinsey article strongly conveys the urgency of climate resilience investment, it provides little technical detail on how the data analytics were performed. It does not explain how the framework was developed or how the underlying data were collected to ensure the analysis is accurate. In the absence of this information, the process cannot be considered transparent. Analytics can provide valuable insights into climate resilience technology, but it risks serving only market interests. A balanced approach is needed to incorporate everyone’s actions in fully adapting to climate resilience.




Reference

McKinsey & Company. “Climate resilience technology: An inflection point for new investment.” 2025.

"It’s Getting Harder to Figure Out Whether You Live in a Flood Zone or Not"

 

The article I chose for my second blog is “It’s Getting Harder to Figure Out Whether You Live in a Flood Zone or Not” from the Wall Street Journal. I thought this article relates perfectly to our class because it covers the importance of having up-to-date weather data for forecasting future flood zones.

 

One of the main points the article makes is that outdated federal flood maps are leaving millions of U.S. homeowners unaware of their true flood risk. While FEMA’s maps list about 8 million properties as high-risk, the article cites a private model that estimates “another nearly 13 million properties outside the zones with the same level of flood risk.” Another point they raise, which doesn’t help with forecasting flooding, is that the government’s maps are often 10 or more years old and rely on historical data. The article presents this as significant because outdated maps fail to reflect changing realities like heavier rainfall and urban drainage failure. One example they use is a Chicago homeowner who says, “That is what they say: We are in a no-flood zone,” despite having repeated basement floods. As we’ve discussed in class, and as the article mentions, this makes it harder for homeowners to get insured when flooding happens, especially when they’re not considered to be in a flood zone. The article further points out that attempts to update or expand flood zones face resistance and that, according to a federal advisory council, “it can take six years for FEMA to nail down flood-zone boundaries when it should ideally take two.”

 

I chose this article because it relates directly to what we’ve been talking about in class, especially why we’re using tomorrow.io’s data. As the article suggests, much of the data currently used to predict weather, not just flooding, is outdated and not as current as what tomorrow.io provides. If these models were updated with today’s data, it makes me wonder whether the number of homeowners in flood zones would increase.
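As a rough illustration, here is a minimal sketch of pulling a current forecast, assuming tomorrow.io’s v4 REST forecast endpoint; the endpoint, parameters, and response field names here are assumptions that should be checked against the official documentation:

```python
import requests

API_KEY = "YOUR_TOMORROW_IO_KEY"  # placeholder; requires a tomorrow.io account
URL = "https://api.tomorrow.io/v4/weather/forecast"  # assumed v4 endpoint

params = {
    "location": "41.88,-87.63",  # Chicago, matching the article's example
    "timesteps": "1d",           # daily forecast values
    "apikey": API_KEY,
}
resp = requests.get(URL, params=params, timeout=10)
resp.raise_for_status()

# Response shape is an assumption: daily entries with a timestamp and a
# dictionary of forecast values (temperature, precipitation, and so on).
for day in resp.json().get("timelines", {}).get("daily", []):
    print(day.get("time"), day.get("values", {}))
```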

 

In class, we also went over how to graph these types of models. In relation to this article, it makes me wonder what categories these people would fall under for flood risk, like we did for Asheville. For instance, would all of these people be at high risk? How many people are we missing who aren’t at high risk now but could be in the future? All of these numbers can impact the lives of real people, and it’s sad to know that many of them aren’t even aware of it. Reading this article makes me wonder what other risks Americans will face in the future once this forecasting data gets updated.
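As a small illustration of that categorization exercise, here is a sketch that bins per-property risk scores into categories; the scores and thresholds are made up:

```python
# Bin hypothetical per-property flood-risk scores (0-100) into categories,
# similar to the risk-category graphing we did in class.
def risk_category(score: float) -> str:
    if score >= 75:
        return "high"
    if score >= 40:
        return "moderate"
    return "low"

property_scores = [12, 55, 81, 37, 68, 90, 44]  # invented sample data
counts: dict[str, int] = {}
for s in property_scores:
    cat = risk_category(s)
    counts[cat] = counts.get(cat, 0) + 1

print(counts)  # {'low': 2, 'moderate': 3, 'high': 2}
```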


Jaiden Davey


Article: https://www.wsj.com/us-news/lood-zone-risk-maps-15da3709?mod=climate-environment_news_article_pos1