"I made several great connections through your network. In fact, I was able to over fund my project. I also listed with another network that cost 3X as much and the leads were nowhere near as solid as the investors I met through this network. I will definitely only be using this network in the future. "
Posted on August 16, 2018 @ 07:38:00 AM by Paul Meagher
One aspect of driving that I do not like is switching between low beam and high beam lights. As a vehicle approaches, I can't remember which state my beams are in, and the indicator is obscured by the top of my steering wheel. I have to quickly duck my head down to check the indicator and then adjust the beam intensity if needed. I don't like taking my eyes off the road when vehicles are close, so this is a significant safety issue as well.
Companies developing driverless cars must have figured this out by now. I would love to have a vehicle where I could activate automatic control of light beam intensity. I suspect this is already available somewhere, but I don't research new vehicle technology enough to know. I did some googling and came across a recent research paper, Intelligent Automatic High Beam Light Controller (2018), that proposes a simple, low-cost solution:
In order to make the driving at night time a safe experience and more friendly to the other drivers on the road an automatic high beam light controller is needed. This paper presents, a simple, low cost and easy to install, design for an intelligent automatic on/off high beam light controller. The proposed design was implemented using the required hardware and components. The experimental results show that the controller provides the driver with the required automatic control; by turning on and off the high beam light when facing other drivers. Moreover, the system will turn off the high beam light if there is enough lighting on the surrounding environment such as when driving inside cities.
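To make the logic described in that abstract a little more concrete, here is a minimal sketch of the kind of on/off decision rule such a controller might implement. It is not taken from the paper; the sensor inputs, threshold values, and function name are illustrative assumptions only.

```python
# Minimal sketch of automatic high beam decision logic.
# NOT from the cited paper: sensor inputs and thresholds are assumed values.

AMBIENT_LUX_CITY = 10.0       # assumed: above this, surrounding lighting is adequate
ONCOMING_LUX_THRESHOLD = 2.0  # assumed: forward light sensor detects oncoming headlights

def high_beam_should_be_on(ambient_lux: float, oncoming_lux: float) -> bool:
    """Return True if the high beams should be on given the current sensor readings."""
    if ambient_lux >= AMBIENT_LUX_CITY:
        return False   # enough ambient light, e.g. driving inside a city
    if oncoming_lux >= ONCOMING_LUX_THRESHOLD:
        return False   # facing another driver, dip to low beam
    return True        # dark road with no one ahead, high beam allowed

# Dark rural road with an oncoming car, then with a clear road ahead:
print(high_beam_should_be_on(ambient_lux=0.5, oncoming_lux=5.0))  # False
print(high_beam_should_be_on(ambient_lux=0.5, oncoming_lux=0.0))  # True
```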
This automatic high beam light controller example illustrates the point I would like to make about the future of driverless cars. Instead of expecting driverless cars to arrive on the scene some year in the future across all new car models, I would suggest that the companies developing driverless car technology are going to need to figure out how to modularize the different technologies involved (such as automatic high beam light control) and gradually incorporate these modules into new car models. In other words, there will be a gradual transition to driverless cars, with new car models offering one or more modules that assume control of some aspect of driving, such as high beam control.
Another feature might be automatically staying between the median line and the side of the road. It might not be available under low-light conditions unless you have the high beam control module.
We often take for granted all the abilities required to perform a complex act like driving. The car industry might eventually agree on a list of component abilities and strive to make their component modules interchangeable or reusable across car platforms.
In my opinion, what is interesting about companies developing driverless car technology may not be so much the end game - the fully automated driverless car. Rather, it is the component technologies they are developing and how those component technologies will be gradually released in new car models. I would argue that the roadmap of driverless car technology is to develop modules that automate different aspects of driving, not to fully automate driving in some high-tech car of the future.
Posted on January 10, 2018 @ 12:22:00 PM by Paul Meagher
I recently came across a site called The Next System. I went there to find an article by Peter Victor and Tim Jackson called Towards a New Green Economy (November, 2016). I'm still in the process of reading that article.
A lot of the articles on this site are trying to articulate what "the next system" will be. Much of the discussion is focused on what the next economic system will be.
I think it is an interesting question to ponder: what will the next system be?
Adding the term "system" implies a level of interconnectedness among the components that rules out smaller innovations that affect us in more limited ways. 99% of innovations that are hyped probably fall into that bucket.
So the next system will be interconnected in certain ways that produce some massive, hopefully desired, outcome.
It might also be nice if the next system wasn't 50 years away; the closer in time, the better.
One candidate for the next system would be a Solar Panel Economy. A solar panel economy is simply an economy where the installed solar panel area increases exponentially every year, with a Gross Domestic Solar (GDS) index being used instead of a Gross Domestic Product (GDP) index to measure the overall health of the material economy.
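As a rough illustration of how a GDS index might work, here is a small sketch that converts hypothetical installed panel area figures into an index relative to a base year and reports year-over-year growth. The numbers and the exact definition of the index are my own assumptions, not an established statistic.

```python
# Illustrative Gross Domestic Solar (GDS) index: cumulative installed panel area,
# indexed to a base year, with year-over-year growth. Figures are made up.

installed_area_km2 = {2015: 10.0, 2016: 13.0, 2017: 17.5, 2018: 23.6}

years = sorted(installed_area_km2)
base = installed_area_km2[years[0]]

for year in years:
    index = 100.0 * installed_area_km2[year] / base
    if year == years[0]:
        print(f"{year}: GDS index = {index:.1f} (base year)")
    else:
        prev = installed_area_km2[year - 1]
        growth = 100.0 * (installed_area_km2[year] / prev - 1)
        print(f"{year}: GDS index = {index:.1f}, growth = {growth:.1f}%")
```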
A solar panel economy may not sound as sexy as a new green economy, which is a more abstract goal. What I am proposing is a very specific goal for the next system that would be measured by a specific growth indicator. The indicator may only be useful over a certain period, to be replaced with some other indicator once we have achieved sufficient growth in solar panel area.
A solar panel economy is one that an increasing number of people can participate in as solar technology continues to get cheaper, better and more accessible.
A solar panel is the specific enabling technology for a larger solar power system that can be installed on homes, on apartments with the right orientation, or on sheds to provide motive power. It doesn't have to be a big installation, and people with cheap outdoor solar lights are arguably slowly easing into that economy. They are gaining some experience and perhaps starting to see other opportunities for the deployment of solar panels.
I'm not suggesting that other sources of renewable energy are not important or significant or shouldn't be included in some calculation of the health of the material economy. Solar panel technology is arguably a bit different in terms of how accessible and deployable it is relative to other renewable technologies and for that reason should be highlighted in some way in how we measure material progress.
Part of what inspired this suggestion is my own tinkering with the main components of a solar power system: a 100 watt solar panel, a charge controller to trickle charge a deep cycle battery, an inverter with convenient electrical outlets attached to convert the battery's DC power to AC power, and some wires and car starter cables to interconnect everything. Trying to scale up solar power to just a 100 watt system opens your eyes to what the next system could be in a way that is difficult to appreciate until you have tried to assemble a small scale solar power system. The potential to be "off grid", for example, opens up opportunities for where you might build and what you might build.
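For readers tempted to try the same experiment, here is a back-of-the-envelope sketch of the sizing arithmetic for a small system like the one described above. The sun-hours, battery capacity, and efficiency figures are rough assumptions for illustration, not measurements from my setup.

```python
# Back-of-the-envelope energy budget for a small 100 watt solar setup.
# All figures below are rough assumptions, not measured values.

panel_watts = 100           # nominal panel rating
peak_sun_hours = 4.0        # assumed full-sun-equivalent hours per day
charge_efficiency = 0.8     # assumed charge controller + battery losses
inverter_efficiency = 0.9   # assumed DC-to-AC conversion losses

battery_ah = 100            # assumed deep cycle battery capacity (amp-hours)
battery_volts = 12
usable_fraction = 0.5       # avoid deeply discharging a lead-acid battery

daily_wh_into_battery = panel_watts * peak_sun_hours * charge_efficiency
usable_storage_wh = battery_ah * battery_volts * usable_fraction
daily_ac_wh = daily_wh_into_battery * inverter_efficiency

print(f"Energy into battery per day: {daily_wh_into_battery:.0f} Wh")
print(f"Usable battery storage:      {usable_storage_wh:.0f} Wh")
print(f"AC energy available per day: {daily_ac_wh:.0f} Wh")
# Roughly enough to run a 60 W load for 4 to 5 hours a day under these assumptions.
```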
Setting up a small scale residential solar power system is possible, it can happen now with off-the-shelf technology, and most people can begin to participate as long as they don't electrocute themselves. For these reasons, perhaps the solar panel economy will be the next system?
This probably sounds like cheerleading for the solar panel industry; to learn about some of the downsides you should read the critical activism book Green Illusions (2012) by Ozzie Zehner. I am just putting the solar panel economy idea out there and suggesting that as more people start to assemble the next power system in their homes, apartments and sheds and share their experiences, that could spark the solar panel economy that I am predicting is on the near term horizon.
Energy drives everything, and as we start switching the energizing system of the economy to solar, we should be measuring that growth using a Gross Domestic Solar index and perhaps eventually using GDS as a proxy for the health of the material economy. In the material economy we now require a decoupling of environmental impact from GDP growth, and that might only be possible with the advent of a solar panel economy.
Posted on May 24, 2017 @ 07:41:00 AM by Paul Meagher
Crowdfunding is regarded as a recent evolution in finance; however, there were precursors to crowdfunding, and crowdfunding exists in less conspicuous forms all around us today.
In 1983, Bill Mollison, the father of Permaculture, discussed the topic of self-finance in his Permaculture Design Course lectures.
Bill advised that the way to self-finance a business was to presell the item that your business was going to sell. This would give you the funds to actually create the item. He cited the example of a startup restaurant that began by preselling a certain number of meals at the would-be restaurant. Bill noted that when you presell in this way, not all of the presold inventory will actually be consumed, possibly only half (another benefit of preselling). One marketing technique that Bill promoted was to send handwritten postcards (with postage paid for a response) to a carefully selected set of contacts, advising them of your upcoming venture and what they might do to help. Bill advised that this would only work if you had a good product to presell and some convertible moral capital.
One way to look at crowdfunding is as an evolving market with self-finance being one of its precursor forms.
There has been recent legislation to open up the crowdfunding market, and as a result we are seeing more participation in this form of financing. Markets can evolve through regulation just as football, soccer, and rugby evolved from primitive football. Likewise, crowdfunding can be viewed as an evolution of self-finance, with the proviso that self-finance has been around longer and may survive longer in the end.
Regulations require that arbitrary quantities be used to circumscribe what is or is not considered to be crowdfunding. One arbitrary quantity is the minimum number of micro-investors or micro-funders that must participate in a fund raise in order for it to be considered crowdfunded. If you have only 2 to, say, 10 funders, that is probably not going to be considered an example of crowdfunding.
You can look up what minimum number they suggest as the threshold, but the number is arbitrary, and we may sacrifice a deeper understanding of self-financing strategies by accepting regulated limits as real limits.
Perhaps all we can say is that the threshold of crowdfunding is crossed when there are a lot of funders involved.
Amazon does crowdfunding on behalf of authors when it presells their work in progress. Amazon is not often identified as a crowdfunding platform perhaps because it is more strongly identified with being the premier e-commerce platform.
Online personalities with lots of Facebook, YouTube, Instagram, or Twitter followers may be able to use their moral capital to crowdfund upcoming projects. They can try to bring their following to a crowdfunding platform to collect funds.
You can squander moral capital if the product you ultimately deliver is subpar. That has happened to me, and it will likely not happen a second time with the same person. It is important to ensure that you have a good product to presell by developing prototypes, minimum viable products, meals, chapters, etc. to get feedback, so that you can verify that it is a good product before you spend moral capital promoting it.
In this blog I have suggested that crowdfunding might be better viewed as existing on a continuum rather than as a discrete type of financial innovation. It has precursors in the ways that entrepreneurs have self-financed in the past, and as such we can potentially learn about modern day crowdfunding by studying strategies used in the past. We obviously have a layer of social media on top of everything these days, and I acknowledge that this has a big influence on how self-finance is done today; but the basic needs for a good product and for moral capital to spend arguably apply today just as they did in the past.
Not only does crowdfunding extend back into pre-internet days, it also extends laterally into platforms such as Amazon that can pre-sell books on behalf of authors. By seeing crowdfunding as existing outside of the official crowdfunding platforms, we open up the possibility of integrating crowdfunding into popular e-commerce platforms or social media platforms. And because crowdfunding does not require the internet, it can continue to evolve offline as well.
Update: After posting this proposal I read that Facebook is now allowing its users to create crowdfunding campaigns. I think this supports my thesis that crowdfunding cannot be neatly categorized any more and that it will soon spread into e-commerce and social media platforms, and not just dedicated crowdfunding platforms.
Posted on April 4, 2017 @ 09:57:00 AM by Paul Meagher
I recently received a sales brochure from Stihl because I have purchased a few of their gas-powered string trimmers in the past to deal with grass and weeds around my organically grown grape vines. They devoted the front page of the brochure to promoting the 4 to 8 hour continuous runtime of their AR 3000 backpack battery option. This technology is not new (as demonstrated in this 2013 ad below) but the market for it is arguably starting to catch up.
The reason I am mentioning this is not to promote Stihl's offering, as other companies such as Husqvarna offer a similar backpack option that may be better in certain respects (e.g., the power cords may not get in the way as much when operating). Rather, what interests me is this statement from the Stihl brochure:
To no one's surprise, the battery-powered outdoor power equipment market is growing exponentially with each passing year.
If someone were to ask me where we are seeing huge market growth today, I would agree that the "battery-powered outdoor power equipment market" is a huge area of growth. We are only starting to see some of the innovations that might arise from being able to carry around a longer-running and more powerful energy supply on our backs. These backpacks are quite expensive and marketed more towards the professional landscaper; however, what happens if the prices come down and the number of gizmos we can plug them into goes up? All the power tool brands are investing heavily in lithium-ion battery technology and trying to differentiate themselves on that basis. I am contributing to this growth in my own small way by gradually replacing all my corded tools with cordless versions. The main drawback to the battery-powered backpack is that it still involves a cord, albeit one not plugged into a wall outlet. Having that much power at your disposal, however, means you can operate more power-hungry tools for a longer time, making it a more feasible replacement for gas-powered versions.
I do like the idea of running a string trimmer without producing gas fumes; however, I am under no illusion that this is a completely green technology, because the power plants I am getting my energy from are still burning fossil fuels to generate my power. That is why I see these technologies only reaching their true potential when offered in conjunction with local power generation as well - solar charging, wind-powered charging, micro-hydro, etc. This "battery powered outdoor equipment" market may finally give us sufficient reason to invest in renewable energy technologies at the home scale. Maybe you don't want to invest in enough renewable energy capacity to power your home, but you might be willing to invest in enough to recharge the battery packs for the increasing number of battery-powered tools you own?
Another thing Stihl could develop would be an electric bike that I can plug my backpack into so that I have more reasons to make this investment. Maybe I can run a Stihl coffee pot and hotplate with it on the job site as well? Maybe an exoskeleton attachment to assist me in lifting some heavy logs?
How big is this "battery powered outdoor equipment" market going to be, and how will it evolve in the next few years? I don't know the answer to this question, but it cannot be ignored if you are an entrepreneur or investor looking for an "exponentially growing market". Maybe you will have to work within the Stihl, Dewalt, Husqvarna, Echo, etc. ecosystems to develop new innovations, just like software developers innovate within the Microsoft, Apple, and Google ecosystems. If the companies won't open up their platforms for outside innovation, then we'll need an open-source battery-powered platform that does - if that is even possible with all the licensing and patents involved.
Posted on March 10, 2016 @ 10:19:00 AM by Paul Meagher
This is my third blog related to the book Superforecasting: The Art and Science of Prediction (2015). In my last blog I discussed the importance of updating forecasts rather than just making a forecast and waiting to see whether the forecasted outcome occurs or not. This naturally leads to the question of how we should evaluate our updated forecasts in light of agreements or discrepancies between predicted and actual outcomes. That is what this blog will attempt to do.
The forecasting example I have chosen to focus on is predicting what my book expenses will be for 2016. I came up with an exact estimate of $1920 but pointed out that assigning a probability to a point estimate is tricky and not very useful. Instead it is more useful to specify a prediction interval [$1920 +- $60] and assign a probability to how likely it is that the forecasted outcome will fall within that interval (80% probability). Now we have a forecast that is sufficiently specified that we can begin to evaluate our forecasting ability.
We can evaluate our financial forecasting ability in terms of whether the probability we assign to an outcome accurately reflects the level of uncertainty we should have in that outcome. If you assign an outcome a high probability (100%) and it doesn't happen, then you should be penalized more than if you assigned it a lower probability (60%). You were overconfident in your forecasting ability, and when we score your forecast the math should reflect this. If you assign a high probability to an outcome and the outcome happens, then you shouldn't be penalized very much. The way our scoring system will work is that a higher score is bad and a score close to 0 is good. A high score measures the amount of penalty you incur for a poorly calibrated forecast. To feel the pain of a bad forecast, we can multiply the penalty score by 100 and treat the result as the number of dollars you have to pay out for a bad forecast.
Before I get into the math for assessing how "calibrated" your estimates are, I should point out that this math does not address another aspect of our forecast that we can also evaluate in this case, namely, how good the "resolution" of our forecast is. Currently I am predicting that my 2016 book expenses will be $1920 +- $60; however, as the end of 2016 approaches I might decide to increase the resolution of that forecast to $1920 +- $30 (I might also change the midpoint) if it looks like I am still on track and my forecast might be off by the cost of only 1 book (rather than 2). When we narrow the range of our financial forecasts and the outcome falls within the range, a scoring system should tell us that we have better resolving power in our forecasts.
The scoring system that I will propose will address calibration and resolution and has the virtue that it is very simple and can be applied using mental arithmetic. Some scoring systems can be so complicated that you need to sit down with a computer to use them. David V. Lindley has a nice discussion of Quadratic Scoring in his book Making Decisions (1991). The way Quadratic Scoring works is that you assign a probability to an outcome and if that outcome happens you score it using the equation (1-p)^2 where p is your forecast probability. If the predicted outcome does not happen, then you use the equation p^2. In both cases, a number less than 1 will result so Lindley advocates multiplying the value returned by 100.
So, if it turns out that my estimated book expenses for 2016 fall within the interval [$1920 +- $60] and I estimated the probability to be 0.80 (80%), then to compute my penalty for not saying this outcome had a 100% probability, I use the equation (1-p)^2 = (1-0.8)^2 = 0.2^2 = 0.04. Now if I multiply that by 100 I get a penalty score of 4. One way to interpret this is that I only have to pay out $4 for my forecast because it was fairly good. Notice that if my probability was 0.9 (90%) my payout would be even less ($1), but if it was 0.6 (60%) it would be quite a bit bigger at $16. So not being confident when I should be results in a bigger penalty.
Conversely, if my estimated book expenses for 2016 didn't fall within the interval [$1920 +- $60] and I estimated the probability to be 0.80 (80%), then to compute my penalty I use the second equation, which is p^2 = 0.8^2 = 0.64. Now multiply this by 100 and I get a penalty score of $64 that I have to pay out. If my probability estimate was lower, say 0.60 (60%), then my penalty would be 0.6^2 = 0.36, which multiplied by 100 gives $36. So not being so confident when I'm wrong is better than being confident.
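For readers who prefer code to mental arithmetic, here is a small sketch of the quadratic scoring rule as described above, reproducing the worked examples (with scores read as dollars of penalty). It is just the formula in function form, nothing beyond what the post already states.

```python
def quadratic_score(p: float, outcome_happened: bool) -> float:
    """Quadratic scoring rule: penalty of 100*(1-p)^2 if the predicted outcome
    happened, 100*p^2 if it did not. Lower scores are better."""
    return 100 * (1 - p) ** 2 if outcome_happened else 100 * p ** 2

# Outcome falls inside the forecast interval:
print(round(quadratic_score(0.80, True), 2))   # 4.0  -> pay out $4
print(round(quadratic_score(0.90, True), 2))   # 1.0  -> pay out $1
print(round(quadratic_score(0.60, True), 2))   # 16.0 -> pay out $16

# Outcome falls outside the forecast interval:
print(round(quadratic_score(0.80, False), 2))  # 64.0 -> pay out $64
print(round(quadratic_score(0.60, False), 2))  # 36.0 -> pay out $36
```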
The quadratic scoring rule is summarized in this table:
Source: David Lindley, Making Decisions (1991), p. 24
I hope you will agree that the Quadratic Scoring Rule usefully reflects how penalties should be calculated when we compare our forecasted outcomes to actual outcomes. It measures how "calibrated" our probability assignments are to whether the events they predict actually happen. In cases where we are not predicting numerical outcomes this scoring system would be all we need to evaluate the goodness of our forecasts. Our prediction problem, however, is a numerical prediction problem so we also need to concern ourselves with how good the resolution of our forecast is.
Intuitively, if our prediction interval is smaller and the actual outcome falls within this range, then we consider this a better forecast than one that involves a wider prediction interval. My proposal is simply to measure the size of your range and add it to your quadratic score. So if my prediction interval is [$1920 +- $60] with 80% confidence and I am correct, then my overall score is 4 (see previous calculation) plus the range, which is 120. Let's convert this all to dollars: our overall penalty is $4 + $120 = $124. If we narrow our prediction interval to $1920 +- $30 then we get $4 + $60 = $64 as our penalty score.
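Here is that combined rule in the same style, repeating the scoring function so the snippet stands alone: the total penalty is the quadratic (calibration) score plus the full width of the prediction interval in dollars (the resolution penalty). This is just a sketch of the simple rule proposed above.

```python
def quadratic_score(p: float, outcome_happened: bool) -> float:
    """Calibration penalty: 100*(1-p)^2 if the outcome happened, else 100*p^2."""
    return 100 * (1 - p) ** 2 if outcome_happened else 100 * p ** 2

def forecast_penalty(p: float, outcome_in_interval: bool, half_width: float) -> float:
    """Total penalty = calibration score + width of the prediction interval,
    where half_width is the +/- part of the interval in dollars."""
    return quadratic_score(p, outcome_in_interval) + 2 * half_width

# $1920 +- $60 at 80% confidence, outcome falls inside the interval:
print(round(forecast_penalty(0.80, True, 60), 2))  # 124.0  ($4 + $120)
# Narrowing the interval to +- $30 lowers the penalty:
print(round(forecast_penalty(0.80, True, 30), 2))  # 64.0   ($4 + $60)
```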
In an ideal world we would make exact forecasts (+- 0 as our range) with complete confidence (100%) and the forecasted outcomes would happen exactly as predicted. In this universe our penalty scores would be 0. In the real world, however, our predictions often have calibration or resolution issues so most predictions involve a penalty score to some extent. It might help to think of this as a cost you have to pay to someone because your predictions are not as perfect as they could be.
With this scoring system you can check in on your forecasts at some midway point to see how you are doing. If you update your forecast what you are looking for is a reduced penalty score when you check up on your forecast again. How much your penalty score improves tells you if your updates are on the right track. Generally your penalty scores should go down if you update your forecasts on a regular basis like Superforecasters do. Superforecasters are quite interested in evaluating how their forecasts are progressing and using some simple math like this helps them figure out how well they are doing.
A book that is on my priority list to read is Simple Rules: How to Thrive in a Complex World (2015). Its authors argue that it is often a mistake to use complex rules to solve complex problems (which forecasting problems often are). They document how simple rules are often effective substitutes and can be used more flexibly. It is possible to be more sophisticated in how we evaluate forecasts, but this sophistication comes at a price - the inability to quickly and easily evaluate forecasts in the real world. We often don't need extra sophistication if our goal is to easily evaluate forecasts in order to get some useful feedback and produce better forecasts. I would challenge you to come up with a simpler method for evaluating financial forecasts that is as useful.
If you want to learn more about the motivations, applications and techniques for forecasting, I would recommend the open textbook Forecasting: Principles and Practice.
Posted on March 8, 2016 @ 10:39:00 AM by Paul Meagher
In my last blog I started discussing the book Superforecasting: The Art and Science of Prediction (2015) by Philip Tetlock and Dan Gardner. I suggested that financial forecasting is a useful arena in which to hone forecasting skills, and I used the example of forecasting my book expenses for 2016. I estimated that I would purchase 52 books (an average of 1 per week) and that each book would cost $30, so my overall projected expenses for books in 2016 were $1,560.
It turns out that when I actually tally up all the books I purchased from the beginning of the year until Mar 1, 2016, sum their cost and add taxes, the amount is $583.43 (I don't generally incur any shipping costs). I purchased 20 books in that period. The average cost per book was $29.17 (which was very close to my estimate). If I assume that I will keep spending at the same rate over the remaining 10 months, then my forecasted book expenses would be $3500.57. The difference between my initial estimate of $1,560 and this estimate of $3500.57 is $1940.57. We have quite a discrepancy here.
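Here is the extrapolation spelled out as a small calculation using the figures above; the final cents differ slightly from the $3500.57 quoted here because of rounding.

```python
# Extrapolating year-end book expenses from two months of actual spending.
spent_jan_feb = 583.43          # actual spending, Jan 1 to Mar 1, 2016 (taxes included)
books_bought = 20
initial_forecast = 52 * 30      # original estimate: 52 books at $30 = $1560

avg_cost_per_book = spent_jan_feb / books_bought   # ~29.17
projected_year = (spent_jan_feb / 2) * 12          # same monthly rate for 12 months

print(f"Average cost per book: ${avg_cost_per_book:.2f}")
print(f"Projected 2016 expenses at the current rate: ${projected_year:.2f}")
print(f"Discrepancy vs. initial ${initial_forecast} forecast: "
      f"${projected_year - initial_forecast:.2f}")
```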
When you make a forecast, that forecast should not be written in stone. It is based upon the best information you had available at the time. You learn new information and the world changes, so you have to adjust your forecasts to incorporate this new information. When superforecasters update their forecasts, the change from the previous forecast is generally not a big shift, although big shifts can happen. The information up to that point still has some weight in determining what the current forecast should be. Forecasters need to be wary of overreacting to new information by making large changes to their forecast right away.
Likewise, in light of the new information that my book expenses could be $3500.57, I have to decide how to incorporate this new information into my current forecast of $1,560. Because my estimate of the cost per book was quite accurate ($30), the question boils down to whether I will end up purchasing 116 books instead of the 52 I estimated. Even though I like books, I can't see myself doubling my planned rate of book buying. I don't expect to keep buying at this rate during the spring/summer, as I won't have as much time for reading. So I am inclined to remain close to my original forecast, but perhaps bump it up a bit to take into account the hard data I have on how much I have spent so far.
Financial forecasting is subject to events happening in the world, but it is also subject to policy decisions that control costs. My policy decision is to purchase at the rate of 1 book a week; however, I will also sometimes buy books more impulsively if I'm in a bookstore, or, as happened last Saturday, when a local author was at a seed buying event and I purchased her new Permaculture book. So my model of book purchasing consists of a policy component involving 1 book a week and another "random" component, which I'll simply assume amounts to 1 book a month over and above my policy. This generates a forecast of 64 books per year at $30 per book, which is $1920. So my forecasted 2016 book expenses have moved from $1560 to $1920 as a result of new information about my actual book purchasing costs to date.
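The revised forecast amounts to a one-line model: a planned purchase per week plus an assumed impulse purchase per month, priced at the average book cost. A minimal sketch:

```python
# Revised 2016 book-expense forecast: policy component plus a "random" component.
policy_books_per_year = 52     # 1 planned book per week
impulse_books_per_year = 12    # assumed: 1 impulse purchase per month
cost_per_book = 30             # close to the observed average of $29.17

forecast_2016 = (policy_books_per_year + impulse_books_per_year) * cost_per_book
print(forecast_2016)           # 1920 -> revised forecast of $1,920, up from $1,560
```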
I could wait until the end of the year to see how close my forecasted book expenses are to my actual book expenses, but why wait until then?
I might want to check in after 6 months and see where I stand then and adjust my forecast accordingly. After six months my expenses should
be half of $1920 or $960. So I'll check in again at 6 months and see if my expenses are close to this amount. Superforecasters regularly
update their forecasts and will also often do a post-mortem when examining forecast accuracy to figure out what they did right or wrong.
Incorporating feedback in this way helps to improve future forecasting in that domain.
Another way to make forecasts, instead of the simple point estimates I have used so far, is to forecast that my book costs will fall within some interval with a certain probability. So I might say that in 6 months my book expenses will fall within +- 60 dollars of $960 with a probability of 80%. The two ways I can improve upon my future forecasts are to 1) narrow my range (to +- 30 dollars) and 2) increase my estimate of its probability (to 90%). One method we can use to score such forecasts is quadratic scoring, which penalizes you more for incorrect estimates that are assigned a high probability (90%) of being true compared to a lower probability of being true (60%). I'll leave the discussion of the math used for quadratic scoring for my next blog.
The purpose of this blog was to discuss the idea that being a better forecaster involves updating your forecast as you assimilate new information, rather than just making a forecast and waiting until the forecast date to see if it is correct. Superforecasters update their forecasts regularly, and they generally don't overreact by making big shifts from their previous forecasts. They analyze what went right or wrong when a forecast is checked against actual numbers so they can use this feedback to improve their future forecasts. It is hard to assign a probability to a point estimate, so we introduced the idea of assigning a probability that the forecasted number will fall within some range. In my next blog we will look at quadratic scoring (or Brier scoring), which can be used to evaluate how good these forecasts are.
Posted on February 25, 2016 @ 09:30:00 AM by Paul Meagher
The digital transformation of industries is ongoing and accelerating. The internet, big data, advanced computing and sensors are combining to transform the way that things have traditionally been done.
An interesting recent example of this is General Electric, which is betting that its future growth will come from the Industrial Internet: the convergence of industrial machines, data, and the Internet. General Electric is well known for the industrial machinery it designs and manufactures, but now it wants to be known as a digital company that also makes industrial machinery. Hence the new GE slogan "The digital company. That's also an industrial company".
A major component of their digital strategy is a software platform they call Predix, which can be connected to different types of industrial machines to gather data that can be stored in the cloud and made sense of using their platform. A major market they are looking at is the digital transformation of the Oil & Gas industry, and in the video below you can see their vision for Intelligent Pipeline Systems.
Predicting the future is a tricky business and GE may bomb out on this strategy if the funds are not available to make pipelines intelligent.
On the other hand, when you are as big and influential as GE the future may be more a matter of creation than prediction. And so it is
with many companies contemplating their own digital transformation. You have to try to predict the future but you must also be involved
in creating the digitally transformed version of it you envision.
In this book they identify 4 global trends as the ones to watch out for:
Economic growth is occurring most rapidly in emerging cities in China, India, Asia and Africa. As people continue to move from rural areas to cities and urban clusters, they create demand for infrastructure, goods and services. Multinationals like Coca-Cola, Frito-Lay and Unilever report that most of their revenue growth is a result of growth happening in these markets. When looking for global growth opportunities, do research at the city/urban region level rather than the region/country level, because it is in the cities where growth is most evident.
The effect of technology on business is accelerating. This is no surprise but it does have implications for what needs to be done to keep up to date on technologies and skills in our businesses. It can represent a threat or an opportunity depending on how technology acceleration is managed.
The aging of the population around the world is creating significant problems and opportunities. The world will need to adapt to this new normal with new products and services aimed at the elderly, increasing the retirement age, becoming more resource efficient in the delivery of health care services, re-employing retired workers to retain skills and workforce, etc...
We are becoming more globally interconnected in terms of trade, capital, people and information flow across national borders. The positive aspect is that it is easier for businesses to expand their markets across borders. A negative aspect is that our businesses can be disrupted by new entrants from outside our borders. Either way, it is difficult to ignore this trend, and businesses will need to develop strategies to cope with and exploit this increasing global interconnectedness.
Where I would use the book is as a tool for researching trends. If you are doing work on a business plan you will often have to look into the crystal ball and identify trends that affect the viability of your product or service. None of the trends identified in this book constitute radically new insights but what might be new are some of the numbers, graphs, and references assembled to characterize these trends. The book can be paired with the McKinsey Global Institute website as a credible resource for trend information you can use for big picture business planning. Some numbers you may need to obtain from local data sources but the characterization and analysis of big picture trends can be found on management consulting websites like McKinsey Global Institute. It should be noted that a precursor to the book was published on their website that focused specifically on the technology disruption trend.
The book does not spell out the methodology they use to make trend projections and often they seem to be simple linear extrapolations from present trends. An example would be this trend graph showing the ratio of retirees to children.
A lot might happen between now and 2050 that could change these graphs, so we should have some measure of skepticism regarding them. At the same time, those reading business plans often want to see numbers and graphs, and simple linear extrapolations from present trends may be as justifiable as other methods of predicting the future.
Posted on December 14, 2015 @ 10:37:00 AM by Paul Meagher
Tony Seba is a Lecturer in Entrepreneurship, Disruption and Clean Energy at Stanford University. He has written a recent book
called Clean Disruption (2014) that advances some of his claims about where the energy and transport industries are headed in the near future.
Tony sees a near-term disruption happening in the car industry in the form of electric vehicles, with Tesla being the most notable player but with smartphone companies like Samsung, Apple, and Xiaomi making bold moves to enter the market. It might be surprising that smartphone companies are poised to become major players in the electric vehicle market until you realize that electric vehicles have an order of magnitude fewer parts than internal combustion engine cars (making it more feasible for smartphone companies to get involved) and that many of the near-term trends in the automotive industry involve vehicles essentially becoming "computers on wheels". Vehicles are becoming loaded with sensors and software, and are becoming more connected to the internet. They are also expected to take on other roles besides just passenger transport, such as storing energy from the grid or renewables and distributing energy back to the grid or to wherever it is needed. If we reach the point where we are just riding around in vehicles that drive themselves, the functionality the car delivers to the occupants will be quite different from what a vehicle is expected to deliver today. We will want our vehicles to provide more entertainment options, productivity tools, mobile social networking, etc.
Disruption is a bad thing for those whose jobs depend upon the industry remaining the same, such as manufacturers, sellers, and distributors of internal combustion engine parts. Disruption, however, is also an opportunity for those who get ahead of the curve and find their niche in the disrupted landscape. If the electric vehicle disruption comes to pass we will, for example, need some visionaries to figure out how to make it a sustainable industry as well, by figuring out what to do with all the spent lithium batteries. My own prediction, based on circular economics ideas, is that these cars will increasingly be leased rather than purchased, so that the car brands can take them back, remanufacture the batteries, upgrade some sensors and software, and put them back into the market again, like we do with refurbished computers.
To learn more about Tony's predictions regarding disruptions we might see in the transport industry you can watch a recent talk he gave on clean disruption in the public and private transport industry. He is also in demand as a speaker and you can find lots of YouTube videos of his talks.
Posted on November 30, 2015 @ 08:44:00 AM by Paul Meagher
This week we will be hearing a lot about the Paris climate talks and various solutions. I'll add to the cacophony with my own two suggestions for where the solution might lie.
One idea that I think should be considered is that the main locus for solutions is not at the country, state/province, or individual level but at the urban region level. Mayors will have more impact on climate change than other bureaucrats. Why? Cities, and more generally urban regions (which include agricultural areas on the fringes), are growing in importance and have the local means to address climate change issues at a significant enough scale to make a difference. So maybe we should have sent the mayors of our major cities over to Paris rather than other bureaucrats or groups, because I think it is at the urban region level that we can take our most significant steps toward addressing the causes of climate change. Mayors will need to work to improve our urban ecologies. Individuals can also feel empowered to address climate change at the urban region level and can track their progress using emerging smart city technologies. The urban region level is not too big and not too small to make a difference. The main control knob for adjusting climate is to be found at the urban region level of operations.
The second idea that should be considered is to replace our current make-use-discard linear economy with a circular economics model. The level of change we need to make as a society needs to be commensurate with the size of the problem we are trying to address. While setting targets for greenhouse gases is an important component of the solution, the actual solution may involve a new model for economics. Circular economics might be that model. I'm impressed with the progress the Ellen MacArthur Foundation has made in defining the field of circular economics and in educating decision makers about it - mostly in Europe so far. Here is a talk by Ellen MacArthur that might whet your appetite to learn more about circular economics. In my opinion circular economics is still a work in progress, with very promising idea integration work done so far.
We can combine these two ideas: the suggestion would be that the main way we might address climate change is for urban regions, acting under the direction of mayors, to move towards managing their regions using a circular economics approach as much as possible. If we can optimize the efficiency of cities in consuming resources via circular economics ideas and techniques, then we might have a chance at meeting our greenhouse gas targets.
Posted on June 30, 2015 @ 10:50:00 AM by Paul Meagher
This morning I read an interview with Dennis Meadows called Growing, Growing, Gone: Reaching the Limits. Dennis was co-author of the seminal "Limits to Growth" book. In this interview he expresses his viewpoint on the future and how to deal with it. I found two passages particularly interesting. In one passage he downplays the importance of long term planning as a way to deal with climate change using an interesting white water rafting metaphor:
I think we are now in a situation where it doesn’t make much difference what we want to see happen fifty years from now.
White water rafting provides a useful analogy here. When you are going down the river, most of the time it is placid, but every once in a while, you hit the rapids. When it is placid, you can sit back and think where you want to be, how you should time your journey, where you want to stop for lunch, etc. When you are in the rapids, you focus on the moment, desperately trying to keep your boat upright until you return to quiet waters. During the placid moments, it is very useful to have a discussion about where you want to be tomorrow or the day after. When you are in the rapids, you don’t have the luxury of that kind of discussion. You are trying to survive. Our society has moved into the rapids phase.
Climate change is an example of this. There was a period where we had some possibility of influencing future climate by our decisions about the use of fossil fuels. I think that time has passed. Climate change is increasingly dominated by a set of feedback loops—like the methane cycle and the melting of Arctic ice sheets—which are beyond human control. They have come to be the drivers of the system. The dominant drivers of the system are not people sitting around trying to reach a consensus about which of several different possible outcomes they most prefer.
White water rafting might also be a useful metaphor for thinking about business planning and its relevance to dealing with the day-to-day issues of starting and running a business. Business planning is like the placid lake, and the rapids are where you have to adapt to what life throws at you. Long range planning cannot be used to navigate through the white water part of your journey; there you have to exercise different skills that are appropriate to dealing with the challenges at hand.
Towards the end of the interview, Dennis discusses why he has become more preoccupied with design for resilience than with sustainable development as a way to deal with what the future may hold.
In my own work, I have shifted from a preoccupation with sustainable development, which is somewhat of an oxymoron, toward the concept of resilience. I think that is the future: to understand how different scales—the household, the community, the school––can structure themselves in a way to become more resilient in the face of the shocks that are inevitable regardless what our goals might be.
You see the climate debate evolving this way. Talk about prevention is on the wane, giving way to talk of adaptation. Adaptation really means resilience. It is about designing actions for dealing with New York City the next time superstorms threaten to paralyze the city or for figuring out what California can do if the current drought continues for many more years, or even decades.
Aspirations and good fortune will get us only so far. Human survival cannot risk reliance on them alone.
Posted on September 10, 2014 @ 08:44:00 AM by Paul Meagher
Lately I've been encountering the term Wicked Problems more often (most recently in a lecture on Excess Nitrogen). "Wicked" problems are contrasted with "Tame" problems that we can solve with engineering-type approaches. The distinction may be important because we might easily be misled into believing that a "wicked problem" can be solved with a relatively simple engineering-type solution - for example, that we can solve climate change if each of us plants 10 trees, or if we tax carbon, or if we radically improve the efficiency of transportation, buildings, and appliances. If we believe this, it might be because we don't distinguish between problems that are "wicked" and those that are "tame".
One way in which wicked problems differ from tame problems is that wicked problems have no final solution;
instead, we can do things that improve, mitigate, or worsen the situation but never ultimately solve the
problem. We will not solve the problem of healthcare, for example, once and for all with some clever solution
devised by a group of engineers working at an advanced research lab. That is not to say we can't improve or mitigate issues associated with healthcare, but if someone claims to be offering "the solution", that claim is extremely unlikely to hold up.
One reason it is extremely unlikely is that "wicked problems" are characterized by being difficult to define and formulate, and part of that difficulty arises because there are many different stakeholders involved who have different views on what the problem is. The problem of healthcare, for example, is viewed differently by the public, by nurses, by doctors, by administrators, by government, by insurance companies and so on. These "stakeholders" have legitimate concerns that they all feel need to be addressed by the healthcare system, so defining what the problem even is in the first place is very challenging. Again, we can come together to improve or mitigate problems of healthcare, but we shouldn't expect to solve "the problem of healthcare" with one masterstroke.
Making progress on wicked problems requires that we first recognize the problem as being of the "wicked" type, because this will set up the proper expectations about what can be done and about how the problem-solving process should proceed. To address a "wicked" problem you can't just use engineering-type approaches; you have to use approaches that are more attuned to these types of problems. These approaches, however, are not fully worked out, and that is why the whole idea of "wicked problems" is becoming more of a topic of exploration and research in universities, the military, governments, international organizations and non-governmental organizations.
What can be said so far is that the discipline that spawned the concept of wicked problems, namely systems thinking, is very relevant because it has evolved tools and ideas to help deal with complex systems problems. Engineering-type problem solving is also important because design is critical, and engineers of various sorts are tasked with designing solutions for complex problems. There is also a need for people who can coordinate the different stakeholders, make sure their voices are all heard, create an atmosphere of mutual respect, and build consensus on approaches that might help improve the situation or mitigate problems, though not ultimately solve the overall problem. Internet-related technologies to help do this are an active area of research and development right now and may help change our coping strategies from being more authoritarian to more collaborative in nature.
It has been claimed that we are entering an age where the number of wicked problems we have to deal with is increasing significantly for a variety of reasons. It has also been claimed that we do not just have an increasing number of "wicked" problems to deal with but also an increasing number of "super wicked" problems that are global in scope. In the face of these problems we might feel like all is lost and that we need to start prepping for the doomsday scenario. Perhaps, but another response is to recognize that we can't solve these problems with engineering-type approaches; to be aware that others recognize this as well and are starting to find new ways to coordinate and think about these wicked problems; and to recognize that while we will likely not solve these problems now or ever (i.e., there is no "stopping rule", as they say), there are "good" and "bad" things we can be doing that can improve or worsen these wicked problems.
So my take-home message here is to become attuned to the systems thinking distinction between wicked problems and tame problems and not to think that all problems are of the tame type. There is growing recognition that most of our major societal problems are of the wicked type and that we are using the wrong approaches and interventions to address them. Trying to figure out what the best approaches and interventions are for wicked problems is where we are at now. Those engaged in Social Entrepreneurship are for the most part addressing wicked problems, so they should be aware of the distinction and of some of the newer approaches that are being developed and innovated to address wicked problems.
Posted on August 6, 2014 @ 07:56:00 AM by Paul Meagher
I'm doing some research this morning on a local company called Appleseed Energy. They have recently announced some interesting applications of solar technology. I was aware of the company as a local pioneer in solar and wind installations, but now they appear to be branching out into developing solar applications.
The photo below illustrates some of their recent solar applications.
In the foreground is a solar golf cart, which is being tested at a golf course this summer (over 150 holes before requiring a recharge), along with a solar shed that I don't know too much about but which also seems like a good idea.
What intrigues me about this company is the possibility that this is just the tip of the iceberg for integrating solar technology into buildings, transportation, appliances and devices of all sorts. We may now be entering into a disruptive cycle where some companies get left behind and some companies emerge because they have the dominant solar version of some machine or device. Just as we now say "there is an app for that", perhaps in 10 years it will be common to say "there is a solar app for that". I'm intrigued by the potential for making money off this "Solar Apps" trend, how these apps will enter the marketplace (golf courses and sheds are great choices), the positive environmental impact it could have, and whether some of these Solar App startups might be the next Google, Amazon, or Facebook of the energy industry. There are also lots of opportunities for smaller companies such as Appleseed Energy to make good profits selling solar app packages to local residential and business customers.
What does it take to be a Solar App designer? What kind of career path might you follow to master Solar App Design? To learn more, I watched a YouTube video of Appleseed Energy cofounder Brian Rose and his wife from four years ago, when Appleseed was just starting up. In this video they discuss their off-grid living arrangement. This is the third video in a four-part series. I chose this video simply because it is the one they refer people to from their website to let people know more about them. It appears the company is pivoting towards new opportunities based upon the experience they have gained over the last four years doing more traditional wind and solar installations. Living off-grid might help solar designers better appreciate the opportunities for integrating solar energy technology into everyday life.
Posted on June 27, 2014 @ 08:00:00 AM by Paul Meagher
As mentioned in my last blog, The Future in 2050, I am reading Reinventing Fire (2011) by Amory Lovins, which discusses how the energy transformation from fossil fuel energies to renewable energies will play out between now and 2050. One of the major areas where the transformation will occur is transportation, which accounts for the majority of our current fossil fuel usage (with buildings, industry, and electricity generation being the three other main areas).
So what might our cars look like in 2050? According to Amory, something like the new BMW i3, which was just released in North America in May 2014.
A more important question than what it looks like on the outside is what it is made of. Here the BMW i3 is currently leading the pack because it is 1) commercially available, and 2) made of advanced carbon-fiber composites that are stronger than steel and significantly lighter. These advanced carbon-fiber composites are used to construct the frame in order to make the car much lighter than current electric vehicles. It is considered an "ultralight" vehicle (although not nearly as light as the Volkswagen XL1).
When you can make a lighter vehicle based upon advanced composites, you can use smaller batteries to power it and get more efficiency at the same time. The smaller batteries in turn lead to a lighter vehicle, which opens up other avenues for making powertrain components lighter because they bear less stress. We are still in the early stages of this revolution in ultralight vehicle technology, so it is difficult to imagine how light our vehicles will be in the year 2050. The lightness of the vehicle, however, will provide the basis for economical and efficient vehicles that run purely off electricity and fuel cells and are not reliant upon fossil fuels to power them. So the transformation away from fossil fuel energy to renewable energy by 2050 will involve a transition to significantly lighter vehicles made possible by advanced composites that will replace steel as the primary structural component in our vehicles.
Some reviews of the BMW i3 entirely miss the point of why this vehicle matters. They evaluate it according to traditional driver metrics like handling, looks, power, interior space and so forth. You have to look at the structural makeup of the vehicle to see why it matters.
The major factor holding up more widespread deployment of advanced composites is the lack of manufacturing capacity to produce these composites. Once the supply is more readily available then we can expect to see more widespread deployment of vehicles with ultralight structure and components. Again, BMW is leading the way on developing the manufacturing capacity to produce these composites. The Wikipedia page on the BMW i3 discusses the current state of BMW's carbon-fiber manufacturing efforts:
BMW is manufacturing carbon strands that form the basis of the i3's carbon-fiber reinforced plastic bodywork at a new US$100 million plant built in Moses Lake, Washington, using raw material shipped from Japan. This location was selected to take advantage of the abundant hydroelectric power available in this U.S. region because carbon-fiber production requires considerable energy and would otherwise emit much carbon dioxide. Electricity in this region also costs about one-seventh of what it costs in Germany, providing a financially beneficial reason for the Moses Lake location. The carbon fiber is then shipped to Landshut, Germany, where the carbon-fiber reinforced plastic parts are fabricated, and the vehicle assembly line is located in Leipzig.
As of February 2014, BMW was producing an average of 70 cars a day, about half the planned production. The lower production output is being caused by a high defect rate in the carbon parts. The company plans to invest about €100 million in the production of carbon parts in order to solve the supply problems. According to BMW, there were 11,000 orders globally as of January 2014, including 1,200 from U.S. customers. As a result of high demand and the slow production rate, delivery waiting time extends until September 2014.
So large investments are beginning to be made in "lightweighting" future automobiles, and once the manufacturing capability for advanced composites is less buggy and more available, we can expect to see more components of the vehicle becoming lighter, battery systems becoming correspondingly lighter, powertrains becoming lighter and so on. There are some nice positive feedback loops that kick in once you start developing lighter and lighter vehicles with structural strength equal to or greater than current automobiles.
Getting cars off fossil fuels and onto electricity and fuel cells will be an important component of the energy transformation between now and 2050 but it still does not eliminate fossil fuels from the equation if our power plants are still burning fossil fuels to create electricity. Obviously, the expectation is that power generation will come almost exclusively from renewable energy sources in 2050 due to the increasingly favorable economics of renewable energy production and islandable microgrid technology that is expected to come on stream by then.
The energy transformation between now and 2050 will be a disruptive time for the energy industry as new players emerge and new renewable and alternative energy based technologies displace fossil fuels as the motive power of the economy. Fossil fuels will not disappear, but they may be reserved for making products like advanced composites, asphalt, plastics, etc.. because they are considered too valuable to be burned.
There are tremendous opportunities for entrepreneurs and investors involved in the great energy transformation that is now unfolding. Rocky Mountain Institute is playing a leadership role in these developments:
Posted on June 24, 2014 @ 10:53:00 AM by Paul Meagher
Systems thinking is a useful tool for trying to predict the future. Systems thinking will not necessarily lead to correct predictions, but if systems models are developed, they can help us debate and refine our thinking in a way that other approaches can't. This is in part because systems models incorporate the idea of reinforcing and balancing loops that drive the evolution of a system. This seems to capture the essentials of the problem of predicting the future, namely, identifying the dominant positive and negative feedback loops and how they interact with various physical "stocks" to determine the evolution of major aspects of the economy over time.
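To make the idea of reinforcing and balancing loops acting on a stock a little more concrete, here is a minimal toy stock-and-flow simulation. It is a generic, illustrative model (logistic-style growth of a single stock) with arbitrary parameter values, not a model drawn from the books mentioned later in this post.

```python
# Toy stock-and-flow model: one stock driven by a reinforcing growth loop and a
# balancing loop that slows growth as the stock approaches a carrying capacity.
# Purely illustrative; all parameter values are arbitrary assumptions.

stock = 10.0          # e.g., installed renewable capacity in some unit
growth_rate = 0.25    # reinforcing loop: inflow proportional to the stock
capacity = 1000.0     # balancing loop: limits growth near the carrying capacity

for year in range(1, 31):
    inflow = growth_rate * stock * (1 - stock / capacity)
    stock += inflow
    if year % 5 == 0:
        print(f"Year {year:2d}: stock = {stock:7.1f}")
```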
When trying to predict the future there is the issue of how far into the future we want to look. This depends on what problem we are trying to solve. If the problem we are trying to solve is avoiding the meltdown of society due to climate change, peak oil, water scarcity, etc., then the "future" is probably not 2020. The societal shifts required cannot be implemented in that time frame, as they will require huge infrastructure investments that will take longer to finance and implement. For example, a renewable energy infrastructure to replace most of the infrastructure that now runs on fossil fuels will not be a 5-year project; it is more likely to be a 35-year project, with 2050 being the date when the project might be considered largely "completed".
I'm aware of two recent books that use 2050 (or thereabouts) as a reference point for predicting the future. One book is
Reinventing Fire: Bold Business Solutions for the New Energy Era
(2011) by Amory Lovins and the Rocky Mountain Institute. Another is 2052: A Global Forecast for the Next Forty Years (2012) by Jørgen Randers (who was a co-author with Donella and Dennis Meadows on the seminal Limits to Growth book). I recently acquired the Reinventing Fire book and will be picking up the 2052 book from my local library to help me understand what the future circa 2050 might look like.
The Reinventing Fire book tackles the issue of how we will transform our energy systems away from fossil fuels towards renewable energies. Amory believes this energy transformation will be driven by the increasingly favorable economics of renewable energy, cost savings through incremental and radical efficiency improvements, and the large profits to be made by free-market capitalism as it exploits the tremendous opportunities that arise as renewable forms of energy displace more expensive and less available forms of energy (fossil fuels). Amory looks specifically at transportation, buildings, industry, and electricity generation as the main consumers of fossil fuels and at how they will be impacted by the switch to relying more upon renewable forms of energy. Amory provides a playbook for how the energy transformation will unfold, citing many technologies and design options that exist today and are expected to play important roles in the long transition towards a society powered predominantly by renewable energies.
I'll have more to say about the Reinventing Fire book in future blogs. I'm getting ready to read his chapter on the future of transportation, which could be fodder for my next blog. I want to end today's blog by stressing the point that predicting the future involves picking a date in the future that you are interested in predicting an outcome for. It occurred to me that 2050 is as good a date as any and has the benefit of being more realistic in terms of when we might "solve" some current systemic problems like climate change, peak oil, water scarcity, etc. If we enlarge our timeline for prediction, and for concern, then maybe we can adopt a more optimistic and can-do stance towards what humanity can accomplish, given realistic timelines that encompass our children's future prospects. If we expect to solve major social and ecological problems by 2020 then we are likely in for disappointment; but if our timeline extends out to 2050, then the small leveraging steps we take today, while not solving our problems by 2020, might properly be viewed as part of a sustainability solution that can yield benefits to our children in 2050.