en_tech_talks session 01: the readout
Centre for Net Zero • 5 April 2022
How can machine learning speed up the energy transition and help us reach our net zero targets?
A few weeks ago, Centre for Net Zero invited Jack Kelly, co-founder of Open Climate Fix and ex-DeepMind engineer, and Dr Ramit Debnath, former Gates Scholar and Fellow at the University of Cambridge, to join us at the launch of en_tech_talks to answer this question.
en_tech_talks is a new event series designed to bring together the brightest and most progressive players in energy and tech, to envision, co-design and develop the future energy system. Each session shines a light on leading energy and tech experts at the forefront of the decarbonisation movement.
We were delighted to be joined by a strong turnout of 75+ guests from across academia, policy and industry, with audience members travelling from outside of London to hear from our speakers and take part in the discussion. Jack and Ramit provided a deep dive into their respective areas of research: using machine learning to improve solar photovoltaic forecasting, and understanding the rise in climate misinformation on social media.
Lucy Yu, CEO of Centre for Net Zero, kicked off proceedings, inviting Jack to the stage. Open Climate Fix (OCF) is halfway through an 18-month project with National Grid ESO on solar electricity nowcasting: forecasting power outputs for the next few hours. Forecasting solar PV power generation is typically hard, because power output rises and falls as clouds move around during the day. On a bad day, forecasts can be wrong by 2-3GW, which is significant. Why is it important to forecast solar PV generation more accurately? First, we know that for the grid to operate successfully, supply must match demand. Second, better solar electricity forecasts reduce both costs and carbon emissions: more certainty means less fossil fuel held in reserve to balance the grid, and lower spinning-reserve costs.
So what lies at the heart of OCF’s pioneering approach? In essence, their work uses machine learning and satellite imagery to improve forecasts of PV generation. Currently, we rely on numerical weather models for cloud forecasts, which have several limitations: they take hours to run, so forecasts are stale by the time they arrive; they struggle to resolve clouds (even though they predict wind speeds and temperature more accurately); and historically, precise irradiance forecasts – the amount of electromagnetic radiation received from the sun per unit area – haven’t been prioritised. This combination of challenges results in inaccurate forecasting.
Nowcasting uses machine learning techniques to take recent observations and predict a few hours into the future. OCF combines physical and statistical approaches with machine learning, which learns from historical patterns to predict the future. Training these models on petabytes of historical data takes a long time, but prediction is fast. This is in contrast to hugely complex numerical weather models running on supercomputers, which are slow both to design and to run at prediction time. The result is quicker and more accurate solar PV forecasting. Whilst it’s early days, OCF is already improving forecasts by 2.8x compared to National Grid’s existing models. Jack is confident that his team has only scratched the surface – in future, OCF wants to incorporate the strengths of numerical weather models into its own model and combine this with graph neural networks to replace supercomputers, achieving in a fraction of a second what normally takes hours. The hope is that these quicker, more accurate models are used in the National Grid control room, reducing the overall cost of grid balancing. You can find a link to his slides with more detail included here.
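The contrast Jack draws – slow, one-off training versus near-instant prediction – can be sketched with a toy model. Everything below is illustrative (synthetic data, a simple regularised linear fit), not OCF’s actual architecture, which learns from satellite imagery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for historical data: PV output shaped by a daily cycle
# and slowly drifting cloud cover. Real nowcasting models train on petabytes
# of satellite imagery; this toy uses only past PV readings.
T = 1000
cloud = np.cumsum(rng.normal(0, 0.05, T)) % 1.0
pv = (1 - cloud) * np.sin(np.linspace(0, 20 * np.pi, T)) ** 2

lags, horizon = 4, 4  # window of 4 past readings; target a few steps ahead
X = np.stack([pv[i : i + lags] for i in range(T - lags - horizon)])
y = pv[lags + horizon :]

# "Training" is the slow, one-off step: a regularised least-squares solve.
w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(lags), X.T @ y)

# Prediction is cheap: one dot product per forecast.
forecast = pv[-lags:] @ w

# Persistence baseline for comparison: assume the last reading holds.
persistence = pv[-1]
```

In-sample, the learned weights should beat the persistence baseline because they can exploit the daily cycle; the same train-once, predict-fast asymmetry is what lets a learned nowcast run in seconds where a numerical weather model needs hours.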
Next up, Dr Ramit Debnath joined the stage to discuss how we can use computational social science for energy and climate justice. He opened by highlighting the ‘big net-zero policy question’: how do we place people at the centre of emissions reduction action? Ramit has been addressing this by looking at how public discourse is propagated on social media since 2009. In particular, he’s interested in the space between big data (which tells us what is happening at scale) and thick data (which tells us why something is happening in context) to generate scenarios which are important for the net zero transition.
Ramit talked the audience through two pieces of research: public engagement on emissions reduction in the built environment, and agenda setting by the fossil fuel industry on social media – the latter of which is ongoing.
The first piece of research explored public engagement with emissions in the built environment. Interestingly, this showed higher levels of engagement over time with a concomitant rise in negative sentiment, as discussions pivoted from emissions reduction and energy efficiency towards issues of justice. For example, popular climate hashtags have evolved over time.
This demonstrates how topics change as new innovations or technologies are introduced, or new issues emerge. Ramit attributes the increasing share of negative sentiment to greater climate sensitivity and awareness among the general public, evolving attitudes towards climate change, scepticism of policy interventions, and the influx of new Twitter users over the period. One crucial takeaway for policymakers is the need to raise the salience of these issues quickly and develop corresponding communications strategies.
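The sentiment trend Ramit describes reduces to a simple computation: the share of negative posts per year. The tweets and labels below are made up for illustration; the study derived labels from real data at scale:

```python
from collections import defaultdict

# Hypothetical (year, sentiment-label) pairs standing in for classified tweets.
tweets = [
    (2012, "neutral"), (2012, "positive"), (2014, "negative"),
    (2016, "negative"), (2016, "neutral"), (2018, "negative"),
    (2018, "negative"), (2020, "negative"), (2020, "negative"),
]

counts = defaultdict(lambda: [0, 0])  # year -> [negative, total]
for year, label in tweets:
    counts[year][0] += label == "negative"
    counts[year][1] += 1

# Share of negative sentiment per year; a rising series would match the
# trend described above.
neg_share = {year: neg / total for year, (neg, total) in sorted(counts.items())}
```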
The second piece of research that was discussed focuses on climate misinformation. Ramit first explained the structure of climate misinformation: misinformation is misleading and can be shared without the intent to deceive, whereas disinformation is both misleading and shared with the intent to deceive. He talked us through the sharing of misinformation in different stages across social platforms and how this can create echo chambers. He also touched upon the influence of different human factors that play into their creation, such as ideologies, values and belief systems.
This research uses computational social science to analyse one million Tweets from 2014–2021, sourced from 8 top-polluting fossil firms, 11 non-governmental organisations (NGOs) and 8 international governmental organisations (IGOs). The research is three-fold: it examines whether industry, IGOs and NGOs have distinct echo chambers on Twitter; it estimates the leading topics from IGOs and NGOs and their sensitivity to the industry’s propensity to drive the discourse; and it determines whether market performance influences climate misinformation.
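One way to make the first question – do the groups form distinct echo chambers? – concrete is to measure how often interactions stay within a group. The account names, groupings and `within_group_share` helper below are all hypothetical, not drawn from the study:

```python
from collections import Counter

# Toy retweet edges (source, retweeted account); name prefixes encode the group.
retweets = [
    ("firm_a", "firm_b"), ("firm_b", "firm_a"), ("firm_a", "firm_c"),
    ("ngo_x", "ngo_y"), ("ngo_y", "ngo_x"),
    ("ngo_x", "firm_a"),  # the only cross-group interaction
]
group = {account: account.split("_")[0] for edge in retweets for account in edge}

def within_group_share(edges, group):
    """Fraction of interactions that stay inside one group (1.0 = sealed chamber)."""
    counts = Counter(group[a] == group[b] for a, b in edges)
    return counts[True] / len(edges)

share = within_group_share(retweets, group)  # 5 of 6 edges stay within-group
```

A share close to 1 for every group would be consistent with distinct echo chambers; the real analysis uses far richer network and topic measures than this single ratio.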
The headline takeaway? The high profitability of fossil fuel companies encourages IGOs and NGOs to adopt narratives that focus on clean jobs and business opportunities instead of promoting fossil fuel divestment – suggesting that these entities are aware that economic concerns may constrain public demand for green policies.
In summary, Ramit’s work highlights how climate discourses have become more nuanced over time on social media; the importance of differentiating between greenwashing and climate action; the complex interplay between organisations and their online communications; and the power that social media holds as a tool for getting people to engage with climate issues. The winner of our Twitter competition left Ramit’s keynote motivated by two research conclusions: “Tech and innovation drive the public narrative on climate issues” and “Policy action matters to the public”. You can find more information about his research here.
Keen to come along to the next en_tech_talks session? We’ll be hosting these events on a regular basis, so follow us on LinkedIn and Twitter to ensure you don’t miss out. We’re announcing the next date shortly and are on the lookout for great speakers too. If you or someone you know is interested in sharing an inspiring energy tech keynote with our audience, please fill out this form.