Big Data in International Trade: what is working & what we have learnt so far
Big Data swept across most industries in the last decade. Gartner defines Big Data as high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing to enable enhanced insight, decision making, and process automation. International trade transactions produce such Big Data: every minute, they generate large volumes of paperwork and digital information worldwide from sources such as customs, shipping lines, borders, warehouses, distribution and logistics nodes, port terminals and dockyards.
Figure 1: The 5 Vs characterising Big Data
Given the high density of data linked to trade, many international organisations predicted that Big Data would revolutionise the international trade industry. For instance, in an article published in 2015[1], the World Economic Forum (WEF) predicted that Big Data would transform global trade, starting with logistics optimisation and predictive analytics that would enable businesses to become more efficient. Now that Big Data has permeated our economies, the questions that need to be asked are: how is Big Data being used, and what have we learnt so far about its applications in international trade?
Big Data provides an opportunity to reduce the problem of scarcity in international trade. The fundamental economic problem in a world bounded by finite resources is that of scarcity; the word economics itself derives from the Greek oikonomia, meaning the management of household resources. At the global level, economies have increasingly specialised, and goods and services cross borders multiple times before being finished. Knowledge of suppliers and buyers has become increasingly complex, not to mention dealing with several intermediaries in each transaction and managing exogenous factors (e.g., unforeseen events or economic reforms) that may strengthen or disrupt these networks of transactions.
In the field of trade, Internet of Things (IoT) devices using geospatial tracking can record movements, and divergences from expected movements, in real time and exchange cross-border information digitally, offering a more secure, efficient, speedy, and economical way to collect data. IoT systems can also be designed to ensure the integrity of data about the physical condition of things such as packaging, vehicles, and containers (UNECE, 2022)[2]. As the World Customs Organisation emphasises, such data can be used to detect “complex patterns – such as movements in the price of goods, illegal drug traffic, import routines for smuggling counterfeit goods, potentially high-risk conveyances or the frequently occurring misclassification of goods”. Cognitive applications, such as anomaly detection systems that apply neural networks, understand the “deep context” of a particular situation and identify pertinent patterns using both structured and/or unstructured data (WEF, 2020)[3].
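To make this concrete, the sketch below shows one way a customs risk team could flag anomalous declarations with a small neural-network autoencoder, scoring each record by how poorly the network reconstructs it. It is a minimal illustration on synthetic data; the feature set, threshold and library choice (scikit-learn) are assumptions, not a description of any customs administration's actual system.

```python
# Minimal sketch: flagging anomalous customs declarations with a small
# neural-network autoencoder. Features and numbers are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Toy declarations: [declared unit price, net weight (kg), quantity]
normal = rng.normal(loc=[100.0, 50.0, 20.0], scale=[10.0, 5.0, 3.0], size=(500, 3))
suspicious = np.array([[15.0, 50.0, 20.0]])   # unit price far below the norm

scaler = StandardScaler().fit(normal)
X = scaler.transform(np.vstack([normal, suspicious]))

# Autoencoder: a small network trained to reproduce its own inputs;
# records it reconstructs poorly are treated as anomalies
ae = MLPRegressor(hidden_layer_sizes=(2,), max_iter=5000, random_state=0)
ae.fit(X[:-1], X[:-1])                        # train on the "normal" declarations only

errors = np.mean((ae.predict(X) - X) ** 2, axis=1)
threshold = np.percentile(errors[:-1], 99)    # flag roughly the worst 1%
flagged = np.where(errors > threshold)[0]
print("Declarations flagged for review:", flagged)
```

In practice, such a score would only prioritise declarations for human review rather than trigger automatic enforcement decisions.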
Predictive analytics, drawing on past buying behaviour, feedback from sellers and buyers, payment and invoice risk profiling, and other techniques, allows businesses to anticipate future market behaviour. Moreover, the rapid analysis of large volumes of information leads to more accurate decisions. Ultimately, Big Data holds significant value for economic planning and international trade.
Big Data has enabled production costs to fall for exporting and importing firms. For example, on average, the cost of returning a product (in e-commerce) is 1.5 times the original shipping cost[4]. Big Data analytics can help firms identify the goods most likely to be returned and take the necessary steps to reduce losses and expenses. It has also reduced advertising costs by allowing firms to select the most effective channels for marketing campaigns. Furthermore, Big Data analytics enables businesses to better manage the factors of production (land, labour and capital) and improve the efficiency with which these assets are used.
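As an illustration of how such return-cost savings might be estimated, the sketch below scores orders by their likelihood of being returned and applies the 1.5x shipping-cost figure quoted above to obtain an expected return cost per order. The order features, model choice (logistic regression) and numbers are hypothetical placeholders, not an actual retailer's model.

```python
# Minimal sketch: scoring which orders are most likely to be returned and
# estimating expected return cost using the article's 1.5x shipping-cost figure.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy order history: [order value (USD), discount %, buyer's past returns]
X_hist = rng.uniform([10, 0, 0], [500, 50, 5], size=(1000, 3))
# Synthetic labels: deeper discounts and more past returns -> more likely returned
y_hist = (0.02 * X_hist[:, 1] + 0.15 * X_hist[:, 2] + rng.normal(0, 0.2, 1000)) > 0.8

model = LogisticRegression().fit(X_hist, y_hist)

# Score a new cross-border order
new_order = np.array([[250.0, 40.0, 3.0]])     # value, discount, past returns
p_return = model.predict_proba(new_order)[0, 1]

shipping_cost = 12.0                            # hypothetical outbound shipping cost
expected_return_cost = p_return * 1.5 * shipping_cost
print(f"Return probability: {p_return:.2f}, expected return cost: ${expected_return_cost:.2f}")
```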
As testimony to the opportunities opened by Big Data on the international scene, customs administrations worldwide have moved to leverage the technology. The New Zealand Customs Service developed a new strategy for intelligence-led decision-making based on the data it collects. The Hong Kong Customs and Excise Department began generating massive datasets to gather insights for timely decisions and long-term planning. The Canada Border Services Agency (CBSA) also launched a project to analyse high-volume structured data to address complex problems in its border management. Moreover, in the United Kingdom, Her Majesty's Revenue and Customs (HMRC) initiated a project to collect accurate data and analyse commercial data flows in supply chains.
Private companies have also leveraged Big Data to offer new services in international trade. S&P Global, for instance, offers a platform called Panjiva[5], powered by machine learning and data visualisation applied to shipment data. Listthe[6], a company calling itself the “U.S.A Container Spy”, uses shipping line data for market research, competitive analysis and the identification of source factories. TRADE Research Advisory (Pty) Ltd, a spin-out company of the North-West University, developed an analytic model called TRADE-DSM (Decision Support Model) to assist trade facilitation for private firms. The model identifies realistic export prospects for export-ready and actively exporting businesses looking to extend their reach into international markets. The infographic (Figure 2) illustrates some of the initiatives built on Big Data.
Figure 2: Examples of Big Data projects related to International Trade
Implementing Big Data projects has its challenges. One major challenge is the setup of a Big Data infrastructure. Gathering Big Data requires, among other things, capital, adequate legislation for data security, and facilities and human capacity for data collection, storage, analysis and output. Further challenges include the availability of skills, adequate sources of power, and the ownership of data farms and exabyte-scale facilities. Missing or incomplete legislation protecting users from data misuse greatly hampers trade in services and the data collection that flows from it. For instance, the GDPR (General Data Protection Regulation) limits the type of data organisations may gather and creates certain issues for data collection, because individuals have the right to have their information removed from databases even after having consented to its inclusion. Restrictions on data transfer may consequently lead to erroneous predictions, which runs counter to the very premise of Big Data.
Regrettably, even though Big Data projects can be successfully implemented, many fail due to a lack of clear, explicit, and agreed goals and outcomes, focusing on the technology instead. According to VentureBeat (2019)[7], 87% of data science projects never make it into production, and Gartner predicted in 2019[8] that only 20% of analytic insights would deliver business outcomes through 2022. This poor success rate can be attributed to a variety of factors. According to David Becker (2017)[9], project management and organisational issues account for 62% of big data project failures. Top managers must therefore possess the right vision to develop the right project in the right way. Without such a vision, projects may solve the wrong problem, add no real value, and fail to attract candidates with the skillset adequate for the job.
Figure 3: Factors resulting in Big Data project failures
Big Data technology is at the core of the ongoing optimisation and streamlining of supply chains. In an era of ecological change, nations' priority is to achieve net-zero emissions while keeping up the momentum of current international transactions. Work is underway, and Big Data is expected to play a crucial role in the digitalisation of services as well as in vehicle telemetry to cut carbon emissions. Very soon, in 2023, Maersk is expected to operate the world's first carbon-neutral liner vessel, thanks to fast-tracked advances in data-driven technology. Furthermore, new possibilities are emerging from the analysis of transport data. Satellite tracking of goods shipments makes it possible to estimate trade statistics within 2 to 3 months, compared with the 6 to 12 months required for customs-reported data. Digital is being referred to as the “oxygen”[10] of the energy transition.
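A minimal sketch of the nowcasting idea mentioned above: regressing official monthly import values on satellite-tracked vessel calls, so that an early estimate is available long before customs statistics are published. The vessel-count series, coefficients and port are synthetic and purely illustrative, not drawn from any published model.

```python
# Minimal sketch: "nowcasting" monthly imports from satellite-tracked vessel calls.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# 36 months of history: AIS-derived container-ship calls at a (hypothetical) port
vessel_calls = rng.integers(80, 160, size=36).astype(float)
# Official import values (USD millions), normally published with a long lag
imports = 4.2 * vessel_calls + rng.normal(0, 30, size=36) + 150

model = LinearRegression().fit(vessel_calls.reshape(-1, 1), imports)

# Latest month: customs data not yet released, but vessel tracking is available now
latest_calls = np.array([[142.0]])
print(f"Nowcast of imports: ~USD {model.predict(latest_calls)[0]:.0f}m")
```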
While the technology is very advanced, constraints remain on the capacity to store the volumes of data arriving in all forms (streaming devices, IoT devices, social media, etc.), challenges persist around intellectual property rights and data protection legislation, and inherent biases in the data used still need to be addressed. As economies continue to develop regulations, security layers and greater technological capacity, Big Data is expected to yield its full potential.
Big Data will continue to support innovative technologies such as Artificial Intelligence (AI), whose machine learning and deep learning models are highly dependent on Big Data. In international trade, the most relevant immediate AI opportunities lie in optimising and automating existing supply chain operating models, developing smart solutions, and reducing cost and waste across global value chains[11]. Given the current challenges around data protection legislation, it is predicted that the data collection process will become more ethical in the future, guided by software, best practices, and regulations.
In a world where practically everything is digitalised, Big Data will continue to play a key role in shaping international trade in the coming years, not only by bringing insights and recommendations to trading stakeholders but also by defining better, more efficient, and entirely new channels for trade.
Paul Baker is the founder and chairman of International Economics Consulting Group. He is a consultant for various governments in developed and developing countries, an adviser on global corporate strategies to multinationals, and a Visiting Professor at the College of Europe. Paul is an expert in the Working Group of the World Economic Forum's (WEF) Digital Flows Initiatives, an expert in the WEF/WTO's TradeTech Working Group on trade technologies, and sits on the board of the United Nations Economic and Social Commission for Asia and the Pacific's trade intelligence tools. He is also a member of the UK's All-Party Parliamentary Group on Trade and Investment, a regular contributor to the UK Parliament's Trade Select Committee, and a contributor to UNESCAP and UNCTAD panels and events on trade impact analysis.
References
[3] WEF (2020). Mapping TradeTech: Trade in the Fourth Industrial Revolution. Insight Report, December.