Embracing the open source philosophy: community creation and sharing of standards

Adopting a targeted but resolute open source strategy is now a necessary step in the quest to share standards and reach critical mass

In sum, open source is a means of opening up without losing control: making public the code of a few carefully selected subsystems seems a good strategy for an industry seeking to become more flexible in development and product testing, to innovate at the right level, to impose its standards and to have its solutions adopted on a massive scale.

The ways that we interact with computers are poised to change nearly as dramatically in the coming decades as computers themselves have over the past several decades. We’re in the early stages of an evolution from highly complex, techno-centric approaches to much more intuitive, anthropo-centric interfaces. The coming changes will almost certainly impact the way companies do everything from operating industrial equipment in their own production facilities to designing the products that they sell to customers. Here we have sought to outline our view of the current landscape and emerging technologies, relevant companies and the use cases that they’re enabling.

The term Human-Machine Interface (“HMI”) describes the methods by which people interact with computers, and is a subset of the broader User Interface (“UI”) field. It includes both independent components and unified systems, encompassing:

  • Input technologies such as gesture recognition, eye tracking and voice control
  • Output technologies such as ultrasound, electro-tactile stimulation, and vibrotactile stimulation
  • Analytical techniques such as Machine Learning & Natural Language Processing
  • Unified systems which combine those technologies, such as Augmented Reality (“AR”) & Virtual Reality (“VR”). Note: we have chosen to focus on AR technologies here. Many of the component technologies we discuss have applications in both AR & VR, but they are distinct ecosystems attracting interest from very different user bases. Even though “AR/VR” is commonly lumped together as a single topic, we try to avoid conflating the two.

The QWERTY keyboard, first developed in the 1870s, was one of the first attempts to mechanize input from an individual user into a medium that could be stored & shared by many in the future (a letter or book). Designed to avoid the routine collisions between keys which slowed typing speeds, it demonstrates the type of technological barriers that early HMI solutions focused on. These early approaches required the user to adapt their behavior to fit the machine in ways that were unnatural at best and highly complex at worst. From punch cards to keyboards, the use of these technologies was reserved either for specialists or for those who deliberately acquired a particular skillset (remember typing classes?).

The second wave of HMI kicked off in the early 1990s, by which point GUIs and display technology had developed sufficiently for touch screens to be introduced. This was the first time the medium through which humans communicated with machines became transparent: combining input & output media into a single interface gave users the impression that computers could understand human touch & motion the way other humans do, without a foreign object mediating the process.

The dramatic decline in the cost of computing, driven by consistent innovation in semiconductors and followed by the smartphone revolution, has led to the proliferation of small yet high-powered computers with high-bandwidth connections nearly everywhere they can be useful to humans. Freedom from the space requirements of a traditional stationary computer (i.e. your desk!) and the capability of the latest devices have simultaneously required and enabled new, intuitive user interfaces.

Why does this matter?

The aforementioned changes have not been lost on those with large pools of capital to invest, and both venture capitalists and large tech corporations have not been shy about placing bets in the HMI space. VCs have invested more than $1bn per year into HMI technologies for each of the past three years. Following a brief investment spurt into 2D display companies in 2011, excitement about AR (and VR, although to a lesser extent) has accounted for the clear majority of inflows into the space.

Major tech corporations have also been reasonably active making acquisitions, albeit with a preference for earlier-stage companies that bolster their technology portfolios and talent pools. Intel (Omek and Indisys), Apple (VocalIQ), Google (Flutter), Freescale Semiconductor (Cognivue), and Facebook-owned Oculus (Pebbles Interfaces) have all targeted HMI startups in sub-$100m transactions.

Applications of HMI in Industry 4.0

Lots of attention has been focused on the application of HMI technologies in consumer use cases like virtual assistants. Instead of repeating that message, we will focus on a few specific applications which are impacting industrial companies.

Industrial machine control is an obvious area of focus. The machinery used in industries ranging from oil & gas to healthcare is often plagued with archaic user interfaces, which reduce productivity on machines that are commonly among the most expensive in the facilities where they're located. Integrating state-of-the-art HMI into these assets can create new efficiencies on many levels:

· Machine operators benefit from physical feedback delivered by haptic technologies, and from wearables or AR glasses that free their hands and attention to focus on their primary task. These tools often bring new levels of safety as well, a benefit that matters far more than efficiency ever will.

· Maintenance professionals can quickly visualize machine status in real time, enabling them to prioritize their workload and anticipate tooling and material requirements for individual tasks. Remote collaboration also allows off-site specialists to consult or guide local technicians through tasks that would otherwise require travel.

· Managers are empowered to quickly survey the status of all operations inside a given plant or facility and receive immediate notification when potential bottlenecks arise.

Across these categories, deployment of advanced HMI technology can help attract young people to join an aging workforce (a trend which is consistent with all Industry 4.0 tech).

The automotive sector has also become a rapid adopter of new HMI technology. Automakers all strive to deliver a differentiated UX that breeds demand for a given brand/model. New HMI technologies are most commonly found in the infotainment systems of high-end models, but the redesign of mundane features like door openers has also demonstrated their applicability.

Touch screens and gesture recognition technology have emerged as favorites for automakers. You've likely noticed that the touch screen in your car isn't as sleek as the one on your smartphone; that's because automakers have favored resistive displays over the capacitive screens found on most phones. Long design cycles (7–10 years), the need for longevity (12 years, 200k miles), and an intense focus on cost all contribute to a slow rate of adoption of new technology. However, suppliers like Continental are working to commercialize infrared screens, and infrared sensors are also being evaluated to monitor driver attentiveness in Advanced Driver-Assistance Systems.

Basic gesture recognition sensors are also being applied for tasks outside the car. Drivers approaching their parked Ford Escape with an armful of groceries can now open the tailgate by passing their foot underneath the rear bumper instead of setting down what they’re carrying.

Advertisers are also adopting new gesture recognition and eye-tracking technologies in Digital Out of Home (“DOOH”) advertising. Digital displays are more expensive to deploy, but have been shown to drive 2.5x more revenue per incremental dollar spent. The most advanced ad displays enable a wall-mounted advertisement to become a point of sale. In other cases, they can allow a viewer to access additional content related to the ad via an app download or by launching a website from a QR code. Higher engagement, combined with new analytics that enable advertisers to target precise groups of users, has generated new revenue for media companies and their customers alike.

Conclusion

Many of the use cases identified here may not feel like they're on the cutting edge of technology in today's internet age. However, the computerization of industry and manufacturing is one of the key theses underpinning the Industry 4.0 trend. As such, the integration of technologies already proven in consumer electronics should drive a significant investment and productivity boom, to say nothing of the potential of the state-of-the-art technologies currently in development.

The design and replacement cycles for the types of large physical assets we’ve covered are certainly longer than those in the consumer electronics industries. However, the highly capital-intensive nature of the industries that are being transformed means that dramatic improvements in cost structure and new revenue generated through operating efficiencies are possible.

When combined with the large budgets that are often available to address very specific problems, the opportunity for new HMI technologies to find meaningful customer bases within the industrial and enterprise community should not be overlooked.


A subway that runs at the speed of sound? That's the bold claim made by Canadian startup TransPod when speaking to a room full of inquisitive journalists, students, venture capitalists and industrialists at the Arts et Métiers Hotel in Paris on June 20. Founded in 2015, TransPod is building on the Hyperloop concept with hopes of reshuffling commercial transit maps starting in 2025.

A new mode of transportation to compete with planes and trains

TransPod is taking on Elon Musk's widely publicized concept, in which pods are propelled over layers of air through a partial-vacuum tube, claiming it can correct the concept's presumed flaws. For example, instead of shooting the pod through a tube at 10 Pa, it wants to raise the pressure to 100 Pa to simplify the infrastructure, even if that means increasing per-vehicle energy consumption. Another choice that sets it apart from competitor Hyperloop One is that TransPod is developing an active levitation system intended to reduce jerk for better passenger comfort.
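To get a feel for the energy trade-off of running at 100 Pa rather than 10 Pa, note that at a fixed speed aerodynamic drag scales roughly linearly with air density, and density with pressure. A back-of-the-envelope sketch (our own illustration, not TransPod's figures; the drag coefficient, frontal area and temperature below are assumed values):

```python
# First-order drag estimate: F = 0.5 * rho * Cd * A * v^2, with air density
# rho proportional to pressure at constant temperature (ideal gas law).
# Cd, A and T are assumed illustration values, not TransPod data.

def drag_power(pressure_pa, speed_ms, cd=0.3, area_m2=10.0, temp_k=293.0):
    """Aerodynamic drag power (W) for a pod at the given tube pressure."""
    R_SPECIFIC_AIR = 287.05                          # J/(kg*K) for dry air
    rho = pressure_pa / (R_SPECIFIC_AIR * temp_k)    # kg/m^3
    force = 0.5 * rho * cd * area_m2 * speed_ms ** 2 # N
    return force * speed_ms                          # W

v = 758 * 0.44704                                    # 758 mph in m/s
p10 = drag_power(10, v)
p100 = drag_power(100, v)
print(f"Drag power at  10 Pa: {p10 / 1e3:.1f} kW")
print(f"Drag power at 100 Pa: {p100 / 1e3:.1f} kW")
print(f"Ratio: {p100 / p10:.0f}x")                   # linear in pressure
```

The tenfold increase in drag power is what TransPod accepts in exchange for a cheaper, easier-to-maintain tube; the sketch ignores levitation and propulsion losses entirely.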

The goal is to build an interurban transit system of pods that can accommodate roughly 30 passengers and travel at speeds of up to 758 mph with a lower carbon footprint and ticket price than airplanes. They have been looking at a link between Montreal and Toronto. TransPod also believes its system would realistically be able to compete with road freight on heavy traffic routes.

In 2016, TransPod raised $20.2 million in seed money from the Italian investment fund Angelo Investments and began talks with transit agencies and industry players for a Series A round of $50 million that could become a reality this fall. The company plans to have prototypes ready in 2020 and be fully operational between 2025 and 2030.

From futuristic concept vehicle to resilient mass transit system

It is hard not to admire the determination of TransPod's founders to get their idea off the ground and forge partnerships between industry and the financial world. But open questions remain about the system's technical and economic viability, and their answers to audience questions on that balmy June evening were not always reassuring. For example: when traveling at 758 mph, can you ensure a 90-second headway? Could passengers withstand the deceleration of emergency braking? How would pods turn around at the end of a trip? Who will fund the infrastructure?

Indeed, speed isn't everything, and TransPod is realizing that the propulsion breakthrough requires many other not-so-sexy inventions, like reasonable ways to evacuate passengers if the system goes down. This lack of a systemwide view is a peculiarity we often see in startups proposing disruptive transportation technology: SeaBubbles took Viva Tech by storm when it unveiled its first bubble, but now it has to come up with a turnkey system that includes charging and special safety docks designed for mass transit; skyTran, whose two-passenger monorail pods were supposed to crisscross the skies of Tel Aviv, didn't take off. Its founder, a NASA veteran, may have overlooked the fact that while an Apollo mission only has to work perfectly once, a public transportation system has to operate flawlessly and safely millions of times.

That's in contrast to the rail system and its trusty sidekick, the wheel-on-rail principle. Though often chided for being stuck in the past, its decades of experience in operations and in managing degraded modes are the cornerstone of a resilient and sound public transit system.

Thus, in an environment where vehicles like buses and subways have virtually become commodities, integrators' know-how in operating a mass transit system over time seems to be a major differentiator. Yet like all legacy businesses, they struggle to innovate.

What’s inspiring about TransPod’s venture?

A healthy dose of skepticism about the Hyperloop system isn’t actually a valid reason to throw the baby out with the bathwater. Like TransPod CEO Sébastien, who humbly considered people’s comments so he could improve upon his idea, let’s focus on how we can be inspired by TransPod and others.

To begin with, the few million dollars they raised should enable the startup to complete some work packages. Its main advantage is starting from scratch, whereas traditional manufacturers have to deal with a legacy which is both treasured and unwieldy. TransPod CTO Ryan doesn't get bogged down in the principles of railway signalling. He catapults himself into a world where pods will communicate with each other through artificial intelligence and veillance flux. Some compelling technology bricks will likely emerge from this R&D.

Next, TransPod is managing to attract industrial partners that likely see it as an opportunity to design outside the box. For example, Liebherr-Aerospace has just signed a contract to engineer the cabin system, and IKOS is taking over the propulsion system. This deserves recognition at a time when corporates struggle to implement successful open innovation models.

Lastly, the way TransPod insists upon passenger experience may not just be a tactic to hide its technological flaws. The approach reminds us how fundamental it is to entice riders if we want mass transit systems to remain the backbone of mobility in the future.

From that perspective, it’s easier to understand that the two worlds mentioned here — startups with futuristic transportation projects and conventional mass transit players — are more complementary than adversarial.

True disruption probably lies at their crossroads.

On average, four private companies valued at more than one billion dollars were born per year in the 2000s. By 2012, there were about thirty such “unicorns”, a term first used by Aileen Lee of Cowboy Ventures, a seed-stage venture fund. Today, Crunchbase lists 261 members of the unicorn club, including 49 added in 2016. The number of unicorns is skyrocketing both because of current market conditions and because investors believe these businesses are going to change the world and, above all, provide them with a large ROI.

After combining data from three different databases[1], and excluding individuals such as angels, we identified 298 investors of diverse types that have bet on at least one unicorn, three-fourths of them founded after 1990. Unsurprisingly, Private Equity funds (a category which here includes Venture Capital funds) represented the largest group, accounting for 65% of all unicorn investors. Corporate Venture Capital funds came in second at 17%, followed by banks, insurance companies, governments, incubators, pension funds, angel groups, and others. The “unicorn hunter” with the most unicorns in its portfolio was the US-based Tiger Global Management, which held stakes in more than 30 unicorns.

Who are the originators of this new phenomenon? What is the profile of the investors who have contributed to establishing these “super-companies”, most of which do not earn a single dollar of profit?

Geography

More than half of unicorns are based in the United States and more than a quarter are in China. The geographic dispersion of investors is similar: two-thirds of the investors referenced above are headquartered in the US, followed by China and Hong Kong (15%), the United Kingdom (3.4%), and Germany and Japan (2.0% and 1.7%, respectively).

The large share of the United States in this ranking can be explained by the history of Venture Capital. Contrary to Europe and Asia, the US did not suffer massive destruction during World War II: widely considered the big winner, the US emerged from the conflict strengthened. Unemployment dropped and the country pulled out of the Great Depression, while the rise of prestigious universities like MIT, Stanford and Harvard combined with enormous defense-related government spending during the Cold War to foster the development of innovative projects that required considerable amounts of financing.

It was a Harvard professor, Georges Doriot, who in 1946 created the first fund to invest in young and risky companies: the American Research and Development Corporation. This marked the birth of the Venture Capital market in the US; the VC industry did not begin to develop in Europe until nearly 30 years later, in the 1970s. Around that time, the launch of the NASDAQ in 1971 gave American technology companies another mechanism for raising large amounts of capital, which further cemented US dominance in the field of new technologies.

Within the US, two states stand out: California ranks first with 45% of US investors, followed by the state of New York with 21% of American unicorn investors. Silicon Valley alone, and notably San Francisco, is home to five of the ten most prolific unicorn hunters.

Role

The financing rounds required to support unicorns are often extremely large, frequently in the hundreds of millions of dollars if not a billion or more. Very few investors can write checks of that size on their own, so the sheer size of these rounds often requires a large number of investors to participate. In most cases there is only one lead investor for a given round, although 2–3 investors occasionally co-lead; regardless, the number of “follow-on” investors in a given round is usually much higher than the number of leads. On average, an individual unicorn investor leads fewer than one in three of the deals it participates in.

In addition, there is low correlation between investment frequency (i.e. the number of deals closed per year) and the number of unicorns backed: being an “active” investor does not seem to be enough to catch the stars!

Sector

The average number of unicorns a given investor is likely to back is lower for industry-focused investors, or other funds with narrow investment mandates, than for general-purpose investors. In fact, three-fourths of unicorn investors have a general-purpose investment strategy. However, most unicorns have emerged from a limited number of sectors, namely software and consumer internet, followed by e-commerce, financial services, healthcare and transportation.

To conclude, maximizing your chances of betting on unicorns means to:

  • Be a Private Equity fund with a focus on investing in venture-stage companies
  • Have been founded after 1990
  • Be based in California
  • Be a follower, or “passive”, investor
  • Have a general-purpose investment strategy
  • Invest mostly in the US and China.

[1] Crunchbase, CB Insights and Thomson One Banker

The Grid is becoming smarter, but is that enough for incumbent energy operators and utilities to meet the new challenges they are facing?

Smart grids are energy networks that can automatically monitor energy flows and adjust to changes in energy supply and demand accordingly. The terms “smart grid” and “edge grid” are now widely used, even outside the energy community strictly speaking. Huge investments are also expected: analysts and think tanks are debating the latest projections, with a recent World Economic Forum report citing $2.4 trillion of value to be created by innovation in the electricity industry over the next decade. So how did we get to this point?

Solar and Wind

About 10 years ago, the cost of electricity generated by sunlight hitting polysilicon crystals was approximately $500/MWh. Today the same electricity costs about $50/MWh (and in some cases less[1]), and the cost of wind energy has followed a similar path. This sharp reduction in the cost of renewable generation has strengthened its position as a credible alternative to traditional sources (especially gas and coal). Even after the end of generous incentive programs, the result today is that most new installed capacity, at least in developed countries, is renewable (see this nice article about the “solar singularity” by Greentech Media).
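To put that decline in perspective, a tenfold cost reduction over roughly a decade implies an average annual cost decline of about 20%. A quick back-of-the-envelope computation, using the approximate endpoints cited above:

```python
# Implied compound annual rate of cost decline for solar electricity,
# using the approximate figures above ($500/MWh -> $50/MWh over ~10 years).
cost_then, cost_now, years = 500.0, 50.0, 10
annual_decline = 1 - (cost_now / cost_then) ** (1 / years)
print(f"Average annual cost decline: {annual_decline:.1%}")  # ~20.6% per year
```

Sustained cost declines of that magnitude, year after year, are what turned solar from a subsidized niche into the default choice for new capacity.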

That wouldn't impact the grid much, if not for one main reason: intermittency. Solar and wind are not “dispatchable” sources, or in other words are not “programmable”: you cannot order the sun to shine or the wind to blow. The immediate result of this intermittency is that the grid no longer knows in advance where energy will be produced, or how much, and that's an enormous challenge for grid operators.

Distribution of energy generation

Not only has energy generation become cheaper through renewables, it has also become distributed on a scale that never existed before. Now, in some countries, the electricity needed to run a washing machine or watch TV can be produced on your roof at the same cost as (or in some cases cheaper than) the electricity you can buy from the grid.

While this is convenient for customers, it adds a second challenge for the grid and its operators: we are moving from a network structured around centralized production plants to a network of many disparate sources.

Energy storage

Historically, there has been no obvious way to store the energy produced. As a result, massive quantities of renewable electricity enter the grid, outstripping demand (especially at night, when the wind blows the most and demand is lowest). As the value of electricity is typically set by the balance of supply and demand, negative prices are being observed with increasing frequency, or energy is exported from countries with excess renewable production (Germany, for instance) at very cheap prices. Fortunately, thanks in part to the massive volumes expected from automotive applications, Li-ion batteries now cost less than $230/kWh (McKinsey study, 2017[2]) compared to $1,000/kWh five years ago. Renewable asset owners increasingly have the choice of trading the electricity they produce (i.e. deciding whether to store, consume or sell it), adding yet another challenge for grid operators and incumbents: the energy produced will not necessarily enter the grid immediately, since owners have the option of storing it in batteries.
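The store-or-sell trade-off described above comes down to a simple rule: selling now beats storing only when the current price exceeds the later price discounted by the battery's round-trip losses. A toy illustration with assumed numbers (the ~90% round-trip efficiency is a typical Li-ion figure we assume here; real dispatch decisions also weigh battery degradation, grid fees and forecast uncertainty):

```python
def best_action(price_now, price_later, round_trip_eff=0.90):
    """Naive dispatch rule for a renewable asset owner with a battery:
    storing 1 kWh now yields only `round_trip_eff` kWh sellable later,
    so compare the current price to the efficiency-discounted later price."""
    stored_value = round_trip_eff * price_later
    return "sell now" if price_now >= stored_value else "store"

# Negative midday price: storing clearly beats paying to export.
print(best_action(price_now=-5.0, price_later=60.0))   # store
# Evening peak already here: sell immediately.
print(best_action(price_now=80.0, price_later=60.0))   # sell now
```

Even this crude rule shows why grid operators can no longer assume that generated energy flows into the grid the moment it is produced.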

Major new dynamics are creating massive challenges for incumbents: A/ affordable renewable generation, increasingly distributed on a large scale; B/ new energy exchanges and trading opportunities; C/ supply and demand patterns, and their topography, that are hard to predict. Is the “smart grid” enough to cope with all that?

Smart meters were only the start

Historically, utilities have generated and monitored a fair amount of data from their production assets, from substations at both low and mid/high voltage levels, and to some extent at the point of consumption. But now many scattered production assets belong to customers, and monitoring only traditional assets is no longer enough. This is why, for a few years now, utilities have seen the value of deploying “smart” meters to learn more about customer behavior, from both demand and production points of view.

While smart meters are considered to be entering the mass market (see chart above), many more connected devices are beginning to be deployed by early adopters. These new devices are no longer limited to consumption: rooftop solar and behind-the-meter battery storage both generate production-related data. Information is increasingly accessible and transparent to any player, even outside the traditional energy value chain, including end users and consumers.

Utilities are finally learning that it is not just about being able to deploy and read smart meters. In fact, the complexity that has accumulated in the energy ecosystem is far too great to be dealt with by smart meters alone.

Utilities are expected to respond with massive IoT and data-related investments

IDC predicts[3] that in 2017, utility investments in smart grid technologies will represent $56Bn of the $66Bn in IoT investments utilities are expected to make this year. Three major use cases are likely to drive these investments[4]: smart meters, distributed generation and demand response. The investments will mostly be directed at solving the short-term pain points utilities face, and will try to leverage the data generated as much as possible.

Having said that, many difficult questions remain: who is to finance the capital investments needed to maintain and upgrade the grid now that the value of the energy in transit is decreasing? What is the right balance between operational cost savings, system stability and reliability on the one hand, and revenue and margin improvement on the other hand?

One thing, however, is clear: whoever fails to take advantage of the massive data generated will be left behind. A 2016 Bain & Co study suggested that doubling or tripling the average number of variables used to build a model predicting power outages, for instance, will improve the accuracy of predicted events by 2–3x. In turn, this will reduce the cost, and the negative impact on customers, of dealing with such outages by an equivalent proportion. Considering more data when tackling business problems has a quantifiable positive impact on ROI, and the smarter you are about extracting value from data, the higher that ROI will be.

Troubles ahead for incumbents and role of innovative startup companies

As massive new investments from utilities and incumbents focus on deploying sensors (including smart meters) everywhere and supporting renewable generation capacity, we see a fundamental flaw in this approach: it does not make customers' needs the main focus.

Failing to address customer centricity could be life-threatening for utilities and incumbent players. On the one hand, barriers to entry are increasingly limited; on the other, customers have a deeper awareness of the choices available to them. To understand customers' needs better, utilities and incumbent operators must master data analytics and switch their approach from looking inward to looking outward. This is no small task, but the good news is that many startups can offer inspiration.

Inspiring startups

Startup companies are disrupting the entire energy value chain. Some are taking advantage of new revenue streams available from flexibility options[5], such as KiwiPower and Upside Energy in the UK, or REstore in Belgium. Others are playing a pivotal role in building a new form of “collaborative” or “social” utility, such as Sonnen or Lumenaza in Germany, where individuals exchange the energy produced by their local distributed energy assets (solar PV especially). Finally, “independent energy retailers” are focusing on serving the final customer; examples include First Utility in the UK, Lichtblick in Germany, and Direct Energie and Ekwateur (an Aster portfolio company) in France. These companies are experiencing dramatic growth and are beginning to have a visible impact on major utilities' market shares and, in turn, their P&Ls.

However, startup companies are also helping incumbents become smarter, customer-oriented players. Some focus purely on making data available, for instance making it simple for utilities to publish data, whether through APIs or through applications which leverage the data; one such company in Aster's portfolio is Opendatasoft. Others, like Germany-based Thermondo, are revolutionizing the sales process through pure automation and data-driven insights; they are currently focused on heating systems but plan to add solar PV soon. Still other startups, such as Enbala in the US, and Greencom Networks and Younicos (recently sold to Aggreko for $52m) in Germany, are positioned as providers of solutions that help utilities tap into new revenue opportunities generated by an optimized electricity grid (the Transactive Energy paradigm[6]).

Conclusion

We predict challenging times for incumbent players for the foreseeable future. However, some are learning the hard way and are taking action sooner through investing or partnering directly or indirectly with startups.

In the future, markets will be far more open and liberalized, the cost of producing electricity from renewables could approach zero, and the value of existing traditional assets will need to be mostly written off. The survivors, whether incumbents or new entrants, will be those that master customer relationships and deliver valuable services through a deep understanding of customers' needs. This will come largely through mastering data collection and analytics.

In the meantime, the smart grid must serve new needs. Delivering high power to energy-intensive industries, integrating new utility-scale distributed generation facilities, and complementing self-consumption for commercial and especially residential users will all need to be accomplished simultaneously.

Utilities and incumbent players will need to increase internal efficiency to face increasingly volatile supply and demand patterns with far more intelligence than they seem to have today. The reward will be substantial for those able to accomplish this, and could well carry utilities through the next decade, until, perhaps, radical new paradigms such as the energy blockchain appear.

[1] https://www.lazard.com/perspective/levelized-cost-of-energy-analysis-100/

[2] This is in fact an average, as stationary batteries tend to be more expensive (in the range of $300/kWh) while mobile-application batteries are generally cheaper (around $150/kWh): http://www.mckinsey.com/global-themes/digital-disruption/harnessing-automation-for-a-future-that-works

[3] http://www.businesswire.com/news/home/20170614005185/en/Worldwide-Spending-Internet-Things-Forecast-React-1.4

[4] IDC, Gartner, BCG; data are from 2016.

[5] Demand Side Management, Demand Response.

[6] http://www.gridwiseac.org/pdfs/te_framework_report_pnnl-22946.pdf

A favorable demographic and legislative backdrop for emerging innovations in the field of accessibility

Prospects opened up by advances in geolocation

Our French entrepreneurs, most of whom are young, are bursting with ideas for improving accessibility in daily life. We would like these initiatives to achieve sufficient critical mass for their viability and their profitability — possibly with the support of mobility stakeholders — so that public transportation is not just mass transportation, but rather transportation for all.
  • Fragmentation on both sides of the platform: the average carrier company in Europe or the US owns 3 trucks, and the number of companies that use freight services at least once a year is huge.
  • An inefficient market: the load factor in the logistics sector (the average load of trucks divided by their capacity) is lower than 70%. Launching an UberPool-style service for trucks is one way to address that.
  • A global problem/opportunity, and a growing one: the road logistics industry is expected to triple by 2050, driven especially by growth in emerging markets which face a lack of sea/air/rail infrastructure.
  • For B2B companies, an instant service at the best price is not the primary driver. The primary driver is being able to plan the delivery of goods at a defined price, and to be sure the goods will arrive safely and on time. I can wait for an Uber, but an industrial company cannot stop a production line because missing parts were held until the end of a surge-pricing period before being shipped, just to protect margins.
  • As recently confirmed by the founders of Palleter (a European Uber for trucks) in an insightful Medium post, truck drivers' priority is to deliver goods on time, not to increase load factors. It makes sense at the system level for drivers to pick up extra loads to generate extra revenue, but few drivers really want to do so because the risk of disrupting their operations (e.g. delaying delivery, requiring repacking of the truck) outweighs the reward.
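The load-factor figure cited above is simply carried load divided by capacity, aggregated across journeys. A minimal illustration with invented trip data (the tonnages below are hypothetical, chosen only to show how an empty backhaul drags the fleet average down):

```python
def fleet_load_factor(trips):
    """Compute a fleet-wide load factor from a list of
    (load_tonnes, capacity_tonnes) tuples, one per journey."""
    total_load = sum(load for load, _ in trips)
    total_capacity = sum(cap for _, cap in trips)
    return total_load / total_capacity

# Hypothetical journeys: a full run, a part-load, and an empty backhaul.
trips = [(24, 24), (15, 24), (0, 24)]
print(f"Load factor: {fleet_load_factor(trips):.0%}")  # 54%, well below 70%
```

A single empty return leg is enough to pull an otherwise well-utilized fleet below the sector average, which is exactly the inefficiency that load-matching platforms target.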

Don’t get us wrong: there IS opportunity for “Uber for trucking” startups

How technology will affect the logistics value chain

Who is leading the transformation in logistics?

Drivers: fall of renewables and energy storage costs, ubiquitous digital capabilities

Energy Communities

Impact on the Energy value chain