17 October 2019 - 18 October 2019

Transforming Ocean Data

Chair: Dr Linwood Pendleton

In Partnership with the Ocean Data Foundation

Executive summary

The quantity of data about the global ocean is growing exponentially, but it is not being used effectively enough to inform policy or to drive private sector innovation. This conference tackled two main issues: first, how ocean science and the data it produces can be more effectively shared; and second, how vast amounts of private sector data could be opened up for use in policy and decision making, and to drive wider innovation, including in communication for global public understanding of ocean change.

Commitments were given by philanthropic ocean research funders to prioritise data in cross-sector research planning and in innovative communication strategies.

Research NGOs promised to set standards and show leadership for data release.

Ocean scientists resolved to make greater use of data science; to understand their networks and draw in new partners; to explore incentives for early data release; and to work with parts of the private sector.

Policy-makers committed to explore opportunities for data sharing with defence sectors and national militaries.

All committed to consider how best the ocean data community can leverage its digital assets in the face of rapid technological change.

Change in the uses of ocean data is part of an ongoing transformation in the use and value of data (public and private) in modern economies. Will ocean environmental data be drawn into the open-source movement with open public platforms, or will ocean data be increasingly monetised as business moves to define the economic value of natural services?


Integrating and sharing multiple heterogeneous data sets could lead to discoveries of information about the planet.


What’s the issue?

Rapid environmental change and increasing threats and damage to the ocean are happening at the same time as data about the ocean (from a whole range of sources: private industry, research, and people and social media) is growing. The need to better understand changes in the ocean is urgent, and large quantities of relevant data are potentially available to inform this understanding, but a change in the mobilisation of ocean data is needed to make it discoverable and usable.

This Ditchley discussion, held in partnership with the Ocean Data Foundation and chaired by Linwood Pendleton, brought ocean scientists together with new technologists, data scientists, ocean NGOs, private industry, communication experts, entrepreneurs, tech innovators, and representatives from ocean governance and data platforms.


An increase in both the quantity and diversity of ocean data has been driven by the development and use of new technologies - not just for science but in ocean-based industries, transport, militaries and many other sources. More sensors than ever are now in use, generating new data. There are new instruments for exploring and observing the sea, such as gliders, ROVs and subs that crawl the sea floor, and new ways of imaging ocean ecosystems from drones, planes and cube satellites, all of which create huge amounts of data. We can take near-continuous images of the ocean's surface and use hyperspectral imagery to see below the surface. We have cameras documenting fish and bycatch as it is brought on board, and petabytes of acoustic data are being generated. In some cases, uses of satellite imagery are already overtaking traditional survey methods.

And yet much of this new data (and the troves of historic data) is not being effectively used to inform decision-making, to generate new services or to build better public understanding of the ocean.

The ocean makes up 71% of the earth’s surface and is a resource on which all human life depends. Although data on the ocean is growing, large tracts of the ocean remain unmapped and unknown, and much of the data produced is messy, unused and as yet unusable. Managing the ocean requires evidence and data. The task is to find answers to the technical and cultural challenges that will allow better access to data for those - decision-makers, researchers, businesses, entrepreneurs and citizens - who might innovate data-driven solutions to prevent further damage, to restore ocean health and to increase public understanding.

The United Nations made a call for a transformation in how we conduct ocean science for sustainable development when it declared the UN Decade of Ocean Science. The effective use of data has been the focus of discussions, for example by the Ocean Data Alliance and the World Economic Forum. The ocean research NGO Rev Ocean has highlighted these issues and helped to create the not-for-profit Ocean Data Foundation to create an Ocean Data Platform.  These initiatives signal moves towards the transformation of ocean data as part of the new digital economy.

Current Practice

Improvement or radical change? Data collection is a central feature of ocean science and there is much good practice. In the UK alone, existing services such as the Marine Environmental Data and Information Network provide an open platform for sharing UK marine data. The Partnership for Observation of the Global Ocean (http://ocean-partners.org) and https://oceanscape.org are existing models. Organisations such as OceanWise and the Plymouth Marine Centre work with others to pull data together to create new data. Information about ocean expertise is available via, for example, IODE’s OceanExpert. Maps and spatial data are powerful: the global seafloor mapping endeavour Seabed 2030, for example, is on track to map the ocean seafloor; it includes crowdsourcing techniques and peer review of data, and rewards young scientists for their work with data.

Existing data sharing models work well in specific sectors. But academic cultures and publishing practices can create incentives for data hoarding. The academic/scientific publishing model currently slows data publication. Data attached to academic papers must be made releasable. And simply putting data on a website, without further promotion, was acknowledged as an inadequate means of making it public.

Current data centre structures were considered outdated; the IPCC and other processes are too slow; the ocean observing infrastructure was described as insufficient and unable to support data sharing; and there are no international standards or methodologies. Scattered data was described as wasted data.

Data collection must be accelerated, for example to address climate-related disaster or to complete the missing information about the deep ocean water column (systems like Deep Argo were seen as a priority). Data release in key areas is too slow: the Arctic, for example, is melting away faster than the data about it are coming out. It is not just the data; it is the time taken to translate data into knowledge. The generation of data is outpacing existing capacities to manage it, use it, make sense of it and find meaning in it.

Managing the data flow. The resources needed for storing and curating data can be significant and drive questions about what should be prioritised. Is existing data (the decades of carefully collected and curated time series data) used widely enough? Stranded and wasted data suggests a need for better data management. Silent data accrued as the by-product of other processes, data gathered and shared by citizens via social media, and counter-factual data on what might have happened or has not happened are all part of the bigger data picture. Could data management be more effectively organised, with scientists focusing on collection and data scientists focusing on analysis? Is the ultimate goal to find ways of linking data with open search engines and making it easy to find?

Is more data necessarily better data? More data can potentially degrade decision-making. The lure of more data can lead to buck-passing and decision-making delay. In many instances there is enough data to make good decisions, and the tendency to want certainty inevitably leads to delay. But questions remain: how long will data be useful for? How is it best used in management or conservation plans? Greater peer review of data collection proposals was proposed. For others, data glut is never a problem: issues can be discovered via data exploration, visualisation and experimentation, and machine learning and data science offer ways in.

Ethical and legal questions about the use of emergent tech to create new data. Could data collection and surveillance cross ethical lines? There was concern over mistrust of open data and the impact that GDPR might have on data collection, in particular the distinction between personal and aggregate data. Regulation could have unintended consequences for how data can be shared. Questions have also been raised about the use of observational satellites.

On the other hand, it was argued that not enough data on people in the ocean is collected. Global Ocean Observing Systems (GOOS) should include human data as part of ocean data and human uses of the ocean.

Effective story-telling requires data and evidence. There is much potential to mobilise people’s interests with data-based engagement. The ocean is better understood as a ‘global commons’ than ever before. Increasing data availability and co-production of data with citizens (to include local and indigenous knowledge) makes for more effective communication. Contributions of data, information and analytics can help motivate people to feel part of the solution.

Private sector actors are generating very significant quantities of data. But companies are not sharing data as they might; some share data regionally, some keep data to themselves, others are concerned about how actors opposed to their industry might use data against them (and with good reason). Privately held industry data is just another form of ‘stranded data’ – underutilized and disconnected datasets. Insurance data and risk assessment exercises create further sources that could be integrated with public research data.

The bigger picture. Problems in the ocean were said to start on land; the challenge is to connect data from very different systems for a better picture of how humans are interacting with the ocean. The potential to make connections between, for example, ocean data and demographic, food production or economic data opened-up new possibilities.

The potential for industry data to provide insight into the second order effects of human activity was described as enormous. Acoustic fish tracking used by businesses for example, already indicates the ways fish movements can be better understood and such insight can be extended to the impacts of human activity. Use of drone captured imagery linked to machine learning has opened new areas for conservation and models exist in other areas for example, in the smart city movement. In summary, the opportunity in integrating multiple heterogeneous data sets is to discover new information about the planet.

Ocean science. The need for more effective data sharing within academic and publicly funded ocean research was recognised, even though examples of excellent practice currently exist. To deliver data sharing at the scale now needed, the cultures of research science require some change. If the global sustainable development goals were used more explicitly to direct research agendas, data sharing would follow as part of the mission to deliver them. The current system of academic publishing slows down data release and therefore slows data use: publishing models require review. The drag on data-to-knowledge must be tackled. Greater use of data analytics and data science within oceanographic science would help.

The data links between research science and private companies need to be made. A new generation of data scientists and data journalists will use open source tools and apps.  Scientists must consider use of their data for non-academic purposes so journalists, film-makers, entrepreneurs and others can use ocean data to communicate new insights and knowledge to publics around the world. Co-production of scientific data will be essential for countries such as those in the southern hemisphere without advanced science infrastructure, and co-production of data with citizens can help to raise the profile of the changes taking place in the ocean. 

Data Needs

What sort of data architecture is required to bring together heterogeneous data and match data with new data needs? Can, for example, data keep pace with the needs driven by climate change? Is it possible to bring together old-school methods with modern autonomous systems? What other sectors could be involved - finance, for example?

A data economy was imagined as a system to allow co-design with users; generate data and link with data repositories and distributed networks to amplify access; encourage use of analytics – and include others (private sector); make data usable by following standards and frameworks built by technologists, data scientists and users; deploy social scientists who work with visualisers, gamifiers and communicators to support policy change, public understanding and behaviour change. In such a data economy, publics would be able to crowdsource compute power to drive change in public policy and in personal/public behaviour.

There are many potential new contributors to and users of data. Examples include citizen science communities, app developers and gaming expertise, data journalists, food security interests, shipping and shipping traffic data, those working with refugees such as the UN High Commissioner for Refugees, and people working in new sectors such as marine genomics.

Opening up the ocean data community to new users included ideas for co-designing data collection and analytics. The Science for Nature and People Partnership was suggested as a model.  The creation of access rights (e.g. Creative Commons) and frameworks for comparable data and metadata were put forward, as was the use of national public data services such as Norway’s Agency for Public Management and eGovernment – seen as an exemplar for creating access.

Would facilitation services help to address and smooth out issues relating to data quality? Should there be dedicated bridge organisations or watchdogs to act as conduits, curating data between physical oceanography, ecology/biodiversity, climate change and industry? The Australian Urban Research Information Network (AURIN) has accelerated high-speed networks between six universities sharing data management, demographic trend and urban data. Such services might include networked repositories that follow standards, organise analytics and provide initial results, and brokering services so that access to raw data (although sometimes essential) is not always necessary. The UK Plankton Recorder is a good example of an alliance of specialists brought together by funders to pool data and understand the effects of environmental changes on plankton biodiversity at a global level. Research-based competition can lead to individualistic approaches, defending projects, labs and organisations, and to a proliferation of portals. Could a federated system of portals build towards linked networks?

Data loss and nefarious actors could create risks for this new ocean data community. The EU, for example, made a backup of data.gov data after President Trump was elected. The invasion of Crimea led to significant quantities of lost data and fake climate denial data is actively circulated. These risks present challenges to open-data models.

Incentives can be built into academic funding to require data management plans, and opportunities to incentivise corporate platforms (such as Amazon Web Services or Microsoft Ledger) to support new services were put forward. Curating unwanted legacy data and selling it back to the public or private sector was suggested as a commercial opportunity.

The insurance industry has funded marine data aggregation, for example CCTV cameras on beaches to identify numbers of people (accidental data), linked with tides, currents, wind state, temperature and tidal risks. Such aggregations provide risk mapping for beach areas.

Innovations in methodology sharing have led research projects and organisations to contribute their workflows: 800 methodologies are now held in the Ocean Best Practices system for ocean observing (with an associated journal at Frontiers) and are text-mined to support new research-based connections.

Effective communication is key to making data work better. Global Fishing Watch translates complex data into maps and visualisations which attract huge numbers of hits on its website, galvanising public opinion and driving expectations for public data quality.


Improvements or radical change? How well does the current research practice relate to the needs of the Sustainable Development Goals?  The global Sustainable Development Goals can be seen by scientists as the context rather than the drivers for research. A more direct connection between research and the major societal challenges concerning the ocean could support better collaboration and data sharing.

Science sub-communities have built up different practices and cultures. Data sharing is effective in areas dependent on it, such as work on big-area ecosystems. Greater efforts could be made to document good practice. Norway and the UK were held up as operating effective data centres; the British Oceanographic Data Centre, for example, both collects and shares data.

Co-production. The UN Decade of Ocean Science could work to support the development of research and data use in southern hemisphere countries that generally lack capacity for advanced scientific research. Open data and partnerships with scientific agencies in western countries may help to bring down the cost of science and could help, for example in the case of tsunami warning systems, to better serve the needs of countries such as Vietnam, Indonesia and the Philippines, especially if positive connections between ‘traditional’ and Western science can be made.

Major projects such as AtlantOS may be at risk if engagement and partnerships are not effectively made at all levels - government, academic institutions and communities on the ground. Data that does not conform to standard scientific models of good data can still be very effective for policy and decision-making. Data science skills can enhance the value of data, with attention to data delivery, protocols and standards ensuring reproducibility. The SDGs can help to back the voices of local communities.

A greater open-data ecosystem of interests must be actively encouraged. Open data in cities and city government worked because others outside of government could take advantage of the data. Blue data economies are clearly emerging, and so is citizen science. Distributed computing can support such initiatives: Zooniverse (people-powered research) and iNaturalist are two citizen science examples.

Philanthropic research funders can support collaboration in international science by rewarding research proposals that drive progressive uses of data.  (These funders often have an overview of research interests from the proposals they receive and are aware of weaknesses such as a lack of effective networking within science communities.)

Private sector

The private sector historically has had little incentive to share data, especially data that has commercial value. Companies may have paid for permits to survey, and they may be concerned about their sectors being targeted. What is the gain for the aquaculture, oil or shipping industries? What models could be explored that support private companies in sharing their data? And how will industry data be used to stop damaging and destructive ocean industry?

There are examples: Seabed 2030 has begun to work with FUGRO, a leading provider of geo-data services. FUGRO is sharing bathymetry data at its own cost and is achieving marketing rewards as a direct result.

Some oil and gas companies have been incentivised to provide metocean (meteorological and oceanographic) data to help ensure safety of facilities operations. The off-shore renewable energy sector can include leasing arrangements with clauses for data sharing. Contractors can be mandated to share as part of contracts. If these arrangements are applied systematically across industry sectors to ensure level playing fields, then they can work well for business. An example on bathymetry was to put a value on mapping an area of seabed in order to deliver co-funding with the private sector. Business accreditation schemes could reward data transparency.

Private companies can also suffer from poor internal data management practices that require solving before they can effectively share data externally in open-data formats. A ‘shadow industry’– Esri, Planet, PlanetOS, Digital Globe, Google and other data science companies has emerged to help the private sector share data. This shadow industry is now supporting some of the major geo-service providers.

Intellectual property is a difficult issue. Should data be commercialised or made free? Within the context of the profit motive there may be marketing and publicity gains that can bring the private sector into data forums. Ideas about new forms of capitalism, and why business will value purpose as well as profit, seemed relevant to discussions of ocean data. But private companies are building huge stores of data in unregulated ways. Companies such as Planet or Saildrone are collecting data and building capacity, and ethical and legal considerations can be wholly absent.

A narrative of a sustainable blue economy could assert a principle - if you operate in the ocean, you reinvest in ocean health. Contributing open data can be considered investment and non-proprietary environmental data can be shared.

A proliferation of cheap sensors in the private sector, compared with the higher-quality and more costly instrumentation used for academic research, may present both opportunities and risks. The scale of private sector data collection is vastly different from academic research (a single aquaculture company, for example, takes over 2 million measurements per day across 700 sites). Further steps may be needed for calibrating and validating data. Could peer review of data be more effectively developed?

Building relationships with industry to create access to data and to build financial models for data sharing are key. Questions of storage, processing, charging for data analytics, subscription, distributed ledgers, digital rights management are areas to be investigated.

Ocean related business can be incentivised to share non-proprietary environmental data. The tons of dark data held by the private sector could be brought into play for the wider ocean data community. Freely accessible data will inevitably generate new private sector activity and create new services. The potential for industry data to provide insight into the second order effects of human activity was described as enormous. The ‘shadow industry’ of ocean data science companies can provide a bridge between publicly funded research and private industry.  Sustainable blue economy principles can be used to leverage data sharing as part of an overall approach to supporting ocean health. Social media provides new sources of raw material from which new information and understanding about ocean change can be generated.

Open data?

To be open, data must be accessible, discoverable and transparent. Accessible means discoverable by machines, with lineage and provenance that can be trusted. It is the responsibility of the data provider to take active steps to make data open, discoverable and ‘known’. Social media can be used to make data more discoverable; the increase in article and data downloads as a result of social media use illustrates the role of marketing in making data available.

Can’t pay, won’t sell. Is IP a barrier or a lever? Producers and data owners can be unwilling to let others make money from their (often publicly funded) data. Even open access can be mostly restricted to academic use, not commercial use; scientists do not want people making money on their data. Could digital rights be created? Musicians, artists and authors collect royalties - why not data producers?

Much data is not really open – a better understanding of digital rights management and the potential for blockchain technologies and smart contracts could help. Countries could be paid for offering up their data and financial incentives might work to make private companies clean up and offer up high-quality data.  Many institutions are skirting open-data mandates by publishing on servers without metadata and investing little in IT and data science. Publishing data that cannot be used or must be explained is not publishing! Putting data on websites does not equal open data.

Creating levels of openness, from level 0 (data specific to a scientific discipline) to level N (data of wider public interest), may help establish an architecture for data access. Promotion to recommend and rate datasets, and accessible abstracts to explain data, are essential. Some scientists are wary: would an emphasis on data sharing put pressure on, or take away, funds for fundamental research?

Would a new, cheaper, ocean data-oriented journal, endorsed by the Decade of Ocean Science, work to disrupt the academic publishing industry (which sells publicly funded science)? Or could more pressure be applied to include the cost of publishing data in an open-access journal as part of the grant?

There are precedents for data platforms that allow contribution and extraction of data. Networked flows of information about road conditions, Wikipedia, open street maps and ambitions such as the UK’s Open Data Institute provide leads. Are there opportunities for platforms to support analytic services (for example Facebook’s FBLearner Flow) and to add IP or share derived analytics? Is there an education component? The unique character, scale and diversity of ocean data precludes the wholesale import and application of existing models, but ideas to suggest change clearly exist.

Global internet access for the many is via mobile phones rather than by servers.  Phone apps will continue to be a useful data access/distribution medium for the immediate future. Intuitive graphic visualisation of data, easy to use tools such as story maps and collecting feedback from users can create the means for data sharing. The apps that people want to use tend to become the standards.

Open-source is coming - these discussions may simply be overtaken by those with the facility to capture data and create narratives. Scientists must allow their data to be used for non-academic purposes. The imperative must be to create material that journalists and film-makers can use.

Understanding the challenges - conclusions

  • There must be an emphasis on getting data into the hands of people who can use it - including in all policy and in the global south. There is potential value in accidental data, lower resolution data, and ‘good enough’ data.
  • Science cultures are mixed: motives to share for the common good work alongside publishing incentives that confer reward on individuals who do not release their work except via slow publishing processes.
  • Future business models - to protect or open? There are uncertainties about financial incentives: to build royalties or to let data go free. There are opportunities for digital rights management that enables further use (e.g. Creative Commons). Smart contracts can grant intellectual property to data collectors, and different kinds of licensing can allow people to make money. Taxpayers can still benefit from commercial money-making if the result is to support a sustainable blue economy. Free, no-strings data use can drive creative innovation. The issue of intellectual property (IP) for science data requires more investigation - can IP be monetised, or should it be made open-source? What are the opportunities for digital rights management to encourage greater open access to data?
  • The quantities of industry data and open source data providers are rising exponentially: a new generation of data scientists, journalists will use open source tools. The open-source world will inevitably change expectations.
  • Institutional and political issues can hamper the release of raw data. Aggregate data can be a poor substitute for fuller datasets and a face-saving substitute for industries concerned not to release data. The UN must not let industries attempt to use GDPR to restrict data openness.
  • The ‘shadow industry’ of ocean data science companies provides a link to the private sector. Sustainable blue economy principles can include and recognise data sharing as part of the overall approach to supporting ocean health.
  • Science can get better at networking and understanding what research is being done and what data is available.

What next: ideas, partnerships and commitments

Philanthropic ocean research funders will take forward discussions on data.

REV Ocean will release all research data and set a standard. It will employ data engineers directly to make data accessible. Where possible data will be broadcast in real-time, and where not, all data will be published within a year unless there are good reasons.

Ocean scientists and data science – Data scientists, valued in the private sector, need better recognition in the academy. Many universities are developing data science in biological, geological and spatial work. Don’t let oceanographic research get left behind.

New approaches to tackling incomplete data. Could experimental techniques to predict language patterns be applied to problems of incomplete data to make ‘good enough’ inferences about the missing data (i.e. building on the work of Aza Raskin)?

Could game theory help? The concept of the Shapley value could have application for how we collect and analyse ocean data: https://en.m.wikipedia.org/wiki/Shapley_value
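As an illustration of how the Shapley value might apportion credit among data contributors, the sketch below computes exact Shapley values for a small, hypothetical coalition of ocean data providers. The provider names and coalition worths are invented for the example; the method itself is the standard one of averaging each player's marginal contribution over all orderings of the coalition.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over every ordering of the full coalition."""
    shap = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = set()
        prev = 0.0
        for p in order:
            coalition.add(p)
            v = value(frozenset(coalition))
            shap[p] += v - prev  # marginal contribution of p
            prev = v
    return {p: s / len(orderings) for p, s in shap.items()}

# Hypothetical worths: three data providers whose combined datasets
# are worth more together than separately (values are illustrative).
worth = {
    frozenset(): 0,
    frozenset({"survey"}): 4,
    frozenset({"satellite"}): 3,
    frozenset({"industry"}): 1,
    frozenset({"survey", "satellite"}): 9,
    frozenset({"survey", "industry"}): 6,
    frozenset({"satellite", "industry"}): 5,
    frozenset({"survey", "satellite", "industry"}): 12,
}

shares = shapley_values(["survey", "satellite", "industry"],
                        lambda c: worth[c])
# shares -> {"survey": 5.5, "satellite": 4.5, "industry": 2.0}
```

A useful property for data-sharing incentives is efficiency: the shares always sum to the worth of the full coalition, so the total value created by pooling data is fully (and fairly) distributed among contributors.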

Work on the incentives for the private sector to work with scientists, e.g. exploring the links between geo AI & blockchain.

Understand the open-source mindset. The motivations for sharing data, code and analytics and the opportunities brought by open-source point to value exchange which is not just about money.

What does transformative mean? Is there a broader re-valuing of economic systems? The book Radical Markets: Uprooting Capitalism and Democracy for a Just Society (http://radicalmarkets.com/) was suggested as a prompt for thinking through transformation.

Innovations in the use of data in visualisation and communication, and in telling stories based on data, are a significant area to be developed. [There are many examples: Edward Tufte’s work; David McCandless’s Information is Beautiful (https://informationisbeautiful.net/); Hans Rosling’s Gapminder (gapminder.org); and ESRI’s StoryMaps.]

We were asked to imagine the end of the next decade, 2031: data and money are converging into digital assets, and cloud computing is more ubiquitous with the Internet of Things (IoT) and the growth of sensor technology. How will the ocean data community leverage its assets? This is about transformation rather than efficiency gains.

But do we want to ‘tokenise’ every cubic metre of the ocean and the atmosphere above it and create new categories of currencies? Do we want to define natural resources in terms of economic value - for example, the economic value of the blue whale? Are there better alternatives in the future of markets?

Incentives to get first class data out quickly, to create value for the data and opportunities to cite the data – must be created. The work of Kerstin Lehnert at the Interdisciplinary Earth Data Alliance provides good leads.

A new paradigm is coming – how will the ocean data community respond? 


This Note reflects the Chairman’s personal impressions of the conference. No participant is in any way committed to its content or expression.


CHAIR:  Dr Linwood Pendleton
Conservation and Innovation Advisor, Ocean Data Foundation; Global Oceans Lead Scientist, World Wildlife Fund; and International Chair of Excellence, European Institute for Marine Studies.


Mr Peter Pissierssens
Head, IOC Project Officer for IODE, and IOC capacity development coordinator (2007- ). Formerly: Head of IOC’s Ocean Services (2000-2007); Marine Information Management, IOC’s International Oceanographic Data and Information Exchange (IODE) (1992-2000).


Mr Neil Holdsworth
Head of Data and Information, International Council for the Exploration of the Sea


Dr Pier Luigi Buttigieg
Data Scientist, Alfred Wegener Institute for Polar and Marine Research; Co-chair, Earth Science Information Partners (ESIP) Semantic Technologies Committee; Operations Committee (Editorial), Open Biological and Biomedical Ontology Foundry and Library; Technical lead, IOC-UNESCO Ocean Best Practices System; Essential Ocean Variable lead (Microbial), Global Ocean Observing System Biology and Ecosystems Panel.


Mr Mogens Mathiesen
SVP & Head of Technology and Partnerships at the Ocean Data Foundation. Co-Founder of Arundo Analytics. Ocean Engineer.

Professor Sissel Rogne
CEO /Director, Institute for Marine Research and professor at Department of Biosciences, University of Bergen.


Mr Simeon Archer-Rand
Senior Marine Advisor, Blue Belt Programme, Centre for Environment, Fisheries and Aquaculture Science

Mr James Arroyo
Director, The Ditchley Foundation (2016-). Formerly: Her Majesty's Diplomatic Service (1990-2016): Director for Data, Foreign and Commonwealth Office (2014-16); Director, Europe and Economic Issues (2012-14); Deputy Director, Cyber, Knowledge and Information (2011-12); Political Counsellor, British Embassy, Paris (2011-12).

Mr Jon Clay
Documentary filmmaker specializing in wildlife and environmental stories, currently working at Silverback Films on Our Planet - a collaboration between Netflix and WWF which aims to ignite a global conversation about the recovery of biodiversity and humanity's journey to sustainability.

Mr Dave Dyke
Technology Consultant, Tech HQ (July 2019- ); Chief Executive Officer, See2 Ltd (September 2018- ); Chief Technology Officer, Agilit Ltd (December 1996- ).

Mr Tim Glover
UK Projects Director, Blue Marine Foundation.

Professor Penny Holliday
Associate Group Head of Marine Physics and Ocean Climate, National Oceanography Centre.  Science Coordinator and Principal Investigator for large national and international research programmes.  Chief Scientist for oceanographic research ship expeditions.

Dr Clare Postlethwaite
Co-ordinator of the Marine Environmental Data and Information Network (MEDIN), an open partnership of organisations working together to increase access to the UK's valuable marine environmental data.

Professor Alex Rogers
Science Director, REV Ocean, Norway; Visiting Professor, Department of Zoology, University of Oxford; Senior Research Fellow, Somerville College, University of Oxford; a member of the Group of Experts of the Prime Minister's High-Level Panel for a Sustainable Ocean Economy. Author, The Deep: The Hidden Wonders of the Ocean and How to Protect Them (Wildfire, 2019).

Dr Josh Veitch-Michaelis
Postdoctoral researcher in drone image analyses, Liverpool John Moores University.

Dr Lucy Woodall
Senior Research Fellow: Department of Zoology, University of Oxford, and Principal Scientist: Nekton Foundation.


Mr Steven Adler
Chairman, The Ocean Data Alliance (2017- ) and Advisory Board Member for the US Department of Commerce (2015- ). Formerly Chief Data Strategist, IBM (2000-2018); a member of the IBM Academy of Technology Leadership Team; a member of the US Commerce Department Data Advisory Council, New York Civil Liberties Union Board.

Dr Timothy Bouley
Founder, Emergence Ocean Biotechnology; Advisor, Oceansafe Genomics Horizon Scan. Previously founder and head of World Bank climate change and health program and coordinator of Global Partnership for Oceans. Broadly, working to explore and apply biological ocean data to improve human health.

Dr Annie Brett
André Hoffmann Fellow, Stanford Center for Ocean Solutions and World Economic Forum Centre for the Fourth Industrial Revolution.

Mr David Kelly
Chief Executive Officer & Chief Technology Officer, Innovasea.  Formerly President and Chief Executive Officer, Bluefin Robotics.  Board of Advisor member Hefring Engineering.  Former Board of Advisor member Riptide Autonomous Solutions (sold to BAE Systems), and Open Water Power (sold to L-3 Technologies).

Mr Eric King
Director of Operations, Schmidt Ocean Institute.

Dr Dawn Wright
Chief Scientist, Esri. Fellow, American Association for the Advancement of Science, of the Geological Society of America, of Stanford University's Leopold Leadership Program. Professor of Geography and Oceanography in the College of Earth, Ocean, and Atmospheric Sciences at Oregon State University.


United States of America/Canada
Dr Carlie Wiener
Director, Marine Communications, Schmidt Ocean Institute. Formerly: Communications Manager, Centers for Ocean Science Education Excellence (COSEE) Island Earth; Research and Outreach Specialist, Hawaii Institute of Marine Biology Northwestern Hawaiian Islands Research Partnership, University of Hawaii. Writer and broadcaster.


United States of America/United Kingdom
Mr Aza Raskin
Co-founder, Earth Species Project. Aza Raskin has a diverse background, ranging from dark matter physics to music, design to entrepreneurship. He is a cofounder of The Earth Species Project, a non-profit AI research institute bridging technology and the animal kingdom. He is also cofounder of the Center for Humane Technology, which is leading the charge in reversing the digital attention crisis and realigning technology with humanity's best interests. Previously, Aza Raskin helped build the web at Mozilla as head of user experience, was named to Inc and Forbes 30-under-30 and became the Fast Company Master of Design for his work founding Massive Health, a consumer health and big data company. The company was acquired by Jawbone, where he was VP of Innovation. Before that, he founded Songza.com (acquired by Google), and studied dark matter physics at both University of Chicago and Tokyo University.