Sunday, March 29, 2020

COVID-19's exponential growth

I came across a neat video https://www.youtube.com/watch?v=54XLXg4fYsc&t=362s and thought I would summarize the results in a quick-to-digest format. Skip ahead to see the math behind this plot.


I find it very remarkable how, despite our cultural differences, all people of the world can be fit to this universal exponential curve. When it comes to this virus, we are all the same.

The next thing to note is that once the virus is contained, the exponential growth comes to a grinding halt and the number of new cases per day plummets. Each data point represents one day, so you can see that within a week or two, new cases can fall far below the exponential trend.

What is clear is that China has done the best job of tackling the virus quickly, with South Korea a close second. The USA and Italy still have some work to do before they contain the virus. I hope they don't get to x=1 (everyone infected). Rather, I would like to see the daily new infections plummet well before we get to that point!


The point of no return?

China and South Korea mounted enormous containment efforts, and were successful. For the US or Italy to contain the virus will take a proportionately larger effort at this point, since their case numbers have already outgrown the Asian ones. It may be past the point of no return, where the efforts required to contain the virus (millions of daily tests and an even stricter lockdown) are no longer possible. If that's true, I have a hard time seeing what will stop the virus from hitting x ≈ 1 (we are all infected). I hope it will stop before it gets there, but without a full lockdown I don't see how that will happen. Maybe the virus will just lose steam once enough people properly implement social distancing. It is not the kind of experiment that I would voluntarily sign up to be a part of, that's for sure!

The math

On the x-axis is cumulative cases, normalized by population, and on the y-axis is daily new cases, normalized by population.

Because exponential growth has the property that:

N(t) = N_0 * e^(t/τ)

we get the funny property that the daily new number of cases (the derivative of cumulative cases with respect to time), divided by the cumulative number of cases, is a constant:

(dN/dt) / N = 1/τ

In other words, on a log-log plot of daily new cases versus cumulative cases, every country in pure exponential growth falls on the same line.

When we eyeball the world countries on the above graph, we see that 1/τ is approximately 0.18. Thus, the virus is growing by a factor of e every τ = 1/0.18 ≈ 5.6 days. Or equivalently, the number of infections is doubling every τ*ln(2) ≈ 3.9 days. That seems like a universal number for all humans.
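For the curious, here's a minimal sketch in Python of how you'd extract 1/τ from this kind of plot. The data is synthetic (generated with the τ ≈ 5.6 fit from above), not the video's actual numbers:

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up example data: pure exponential growth, N(t) = 100 * e^(t/tau)
tau = 5.6
t = np.arange(0, 30)
cumulative = 100 * np.exp(t / tau)
daily_new = np.diff(cumulative)          # new cases per day (roughly dN/dt)

# Exponential growth: log(N) is linear in t, and the slope is 1/tau.
inv_tau = np.polyfit(t, np.log(cumulative), 1)[0]
print(f"1/tau ~= {inv_tau:.2f} -> doubling every {np.log(2) / inv_tau:.1f} days")

# The log-log plot from the video: daily new cases vs. cumulative cases.
plt.loglog(cumulative[1:], daily_new, "o")
plt.xlabel("cumulative cases")
plt.ylabel("daily new cases")
plt.show()
```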

data from here:

New York

I included New York in the same graph as the nations above, and it's clear that New York is proportionately getting hit the hardest. It is approaching a population-wide 1% infection rate, approximately 10x the national average, although it also fits the universal COVID-19 line, just like all the nations do. The only thing that makes nations different is at what point they break the exponential growth curve.




The big question still remains: will the virus get close to 100% in any nation or city's population? I am worried that it will, but I hope not!

What it would take to cure the world of COVID19

We are currently in the worst possible time in the USA for COVID19 infections. Every day, the number of daily infections is climbing at an exponential rate. However, I want to try to be optimistic. What are some of the best possible scenarios? Well, let's start with reality, then look at the worst-case, and then claw our way back to an optimistic scenario.

Reality is: China and South Korea have contained the virus. The US and many other countries have not, and as a result, the virus is running rampant. Because the infection is now present in most countries, some of whose leaders (such as Bolsonaro of Brazil) are promoting public interactions and not even attempting containment, worldwide containment is a near impossibility. So based on this, I believe that the COVID19 virus will be around on planet Earth for a while, at least until a vaccine has been made. Within the US, the disease is now so prevalent that containment seems impossible (https://jamanetwork.com/journals/jama/fullarticle/2763187). However, it is difficult to accept this reality. The difficulty in abandoning the idea of containment was expressed by a comment on the above article: "No! While mitigation must be done also, containment must not be abandoned! China has already shown that containment can be done at scale, with aggressive tracking, testing and isolation."


So that brings us to:

Optimistic scenario #1: the US aggressively pursues containment

The idea that we can start testing everyone, implement contact tracing, and carry out the other steps of containment in the USA, when there are already 100,000 positive cases (growing by 20,000 new cases per day) and currently little to no contact-tracing infrastructure, seems unrealistic to me. However, let's try to stay positive. To get this done, President Trump would need to mobilize a WW2 level of effort and come out and say something like:

"America and the rest of the world are under attack by the COVID19 virus. In World War 2, American troops battled around the world to stop the Nazis. Now, we are going to fight around the world to stop the Coronavirus. We are going to lead the world in containment efforts. First, we are going to beat the virus in America, and then we are going to beat the virus in the rest of the world. Starting today, I am authorizing the FBI, CIA, and local police forces to have 5 experienced officers on every new COVID19 case, to find out everyone that they have been in contact with for the past 14 days, and they will all be on self-quarantine for 30 days. There will be a $10,000 fine for with-holding information, and possibly a one year prison sentence, or both. Those that participate in the self-quarantine program will receive their normal salary and any losses up to $10,000 covered by the federal government. Next, under the Wartime Production Act, and starting immediately, the Department of Defence is placing an order for 1 million rapid COVID-19 tests per day, provided by Abbott and other large biotech companies, which we will distribute to the states that are most in need. Finally, I am urging every American to stay at home for the next 60 days. If we work together, we can beat this virus. Once we beat the virus in America, we will beat the virus overseas also"

Then, after the virus is beaten back and normalcy is restored to the US, trade can be re-opened with countries that have taken similar measures. And to normalize trade relations more broadly, the US can send the cavalry (millions of tests and support with contact tracing) to countries in need.

Do I think the above containment scenario is realistic? No. But I like to dream big.

If we execute optimistic scenario #1 above, then we can keep the containment train running with allied countries, sending aid in the form of medical equipment, diagnostic tests, and possibly contact-tracing support. Once a country has gotten rid of COVID-19, we can again have normal travel between the countries. There is this weird scenario possible where one country has contained the virus while another has not. For example, if South Korea has contained the virus but Brazil has not, I don't see how South Korea can let Brazilians into their country. They will effectively have to be an entire country in a plastic bubble.


Optimistic scenario #2: in the long run, we will rid the world of COVID-19

For the next 18 months, we will have to deal with COVID19, until a vaccine is developed. Then, after many years of effort from dedicated individuals and groups who make it their mission to eradicate COVID19, hunting down every last pocket of infection, I believe it will eventually be eradicated, just like two strains of polio have been (https://www.vox.com/2019/10/24/20930553/polio-outbreak-2019-eradication-who).

So in the long-term, we will beat this damned virus.

Optimistic scenario #3: we can do local containment in the USA while other areas do mitigation

The most recent modelling suggests that COVID19 will infect most of the population if left unchecked:

In the paper (not peer-reviewed), they show that if left unchecked, we will get 2.9 million deaths in the USA. However, if we do a triggered suppression (engage in a lockdown and extensive contact tracing) when the death rate increases above some threshold value, we can limit the damage to only 520,000 deaths.

There is another paper (single author, also not peer-reviewed) making the rounds that touts a very optimistic scenario of only 81,000 deaths in the USA:
http://www.healthdata.org/sites/default/files/files/research_articles/2020/covid_paper_MEDRXIV-2020-043752v1-Murray.pdf
However, in the paper they model the growth rate as an error function (with no justification), and an arbitrary inflection point beta is chosen based on empirical fitting. Seems suspect to me.

There seems to be a lot of uncertainty in the number of deaths: the low end is 81,000, the high end is 2.9 million.

So what is the optimistic scenario in the medium term? Well, I think it is suppression (as indicated in the paper https://www.imperial.ac.uk/media/imperial-college/medicine/sph/ide/gida-fellowships/Imperial-College-COVID19-Global-Impact-26-03-2020.pdf), but instead of triggering local lockdowns and contact tracing based on death rate, I propose that we can do better if we perform local lockdowns based on infection rate. In that case, I think we can get even further below the 520,000 deaths that are predicted. I'll take a wild guess and say only 200,000 (made up number). In this scenario, individual states would be placed under federal lockdown once infections climbed above a certain threshold. Until then, infections would be allowed to grow with an exponential doubling time of 3 days. So a state with only 1 case would hit a threshold of 1,000 within about a month (10 doublings at 3 days each, as sketched below). Then the state would be locked down for 3 months. Hmm... now that I think about it, this strategy sucks. The idea of a middle ground, where we just beat down pockets of infection when they arise, is basically equivalent to a permanent lockdown for 18 months. In comparison, optimistic scenario #1 would be easier than this. So basically, option 3 is a shitty option.
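For the record, the back-of-the-envelope arithmetic behind that month figure, with the 3-day doubling time and the 1,000-case trigger as the assumed inputs:

```python
import math

doubling_time_days = 3      # assumed doubling time from the scenario above
start, threshold = 1, 1000  # assumed lockdown trigger threshold

doublings = math.log2(threshold / start)   # ~10 doublings
days = doublings * doubling_time_days      # ~30 days
print(f"{doublings:.1f} doublings -> {days:.0f} days from {start} case to {threshold}")
```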

Conclusion

The optimistic scenario is: we rapidly scale up testing, and implement containment and contact-tracing in the USA. If watching American movies growing up has taught me anything, beating the odds with a last-minute hail-mary is what the USA is all about. The US has done it before: it was one of the last countries to enter WW2 and went for a hail-mary by developing nuclear weapons. It paid off then, and it can pay off now. The alternative I see is that we spend the next 18 months doing a dance of releasing restrictions and then getting infections, until a vaccine is developed. It would be easier to just go all-in now and get the damned virus contained.

Friday, March 27, 2020

Data analysis of COVID 19 stats New York vs USA

Let's look at the COVID19 testing data (taken from https://covidtracking.com/). I took the daily difference of the total, which represents the number of new cases identified per day. This can be seen plotted on a log-linear plot, where exponential growth becomes a straight line.
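Here's a minimal sketch of that computation. The column names (`date`, `positive`, `totalTestResults`) are my assumption about the covidtracking.com export, so check them against the actual file:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed covidtracking.com-style daily CSV with cumulative totals per day.
df = pd.read_csv("daily.csv").sort_values("date")
new_tests = df["totalTestResults"].diff()    # new tests per day
new_positives = df["positive"].diff()        # new positive cases per day

# Log-linear plot: exponential growth shows up as a straight line.
plt.semilogy(df["date"], new_tests, label="new tests per day")
plt.semilogy(df["date"], new_positives, label="new positives per day")
plt.legend()
plt.show()
```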


First, the good news. US testing has been ramping up at an impressive rate. It seems to be following exponential growth, with a scale-up rate that would be unprecedented in conventional hardware manufacturing. In just the last 2 weeks, we've gone from 2,300 tests per day to 85,000.

Now the bad news. It is clear that in the past weeks, the virus has followed an exponential growth trend with frightening consistency, as daily positive tests climb with a doubling time of ln(2)/0.3 = 2.3 days. We can assume that the worst is yet to come, since there is no clear sign of the growth slowing. If this continues, in one week we will have 100,000 new positive cases per day. We can also see that the national average positive test rate is nearly constant, at around 15%.

If we perform this same analysis on New York, we can see some interesting differences as compared to the national average.



First, testing does not seem to fit the same growth pattern as the national average. For some reason, the number of tests seems to be stalled at around 12,000 per day, despite the fact that New York has around half the total US cases, and total US test capacity is closer to 85,000 per day. So in terms of getting enough tests, it seems like New York is getting screwed on a per capita basis.

The next thing to notice is that the distance between the two curves (equal to their ratio on a log-lin plot) is smaller than for the national average. That's because the positive rate in New York is closer to 40%, versus the national average of 15%. That means that nearly half of the people who get a COVID19 test in New York are testing positive. I am not totally sure about the implications of that, but it almost certainly means there is not nearly enough testing to have any chance of containment. The silver lining is that the number of new positive cases per day seems to be leveling off in New York. However, with the stories coming in of an overwhelmed healthcare system, I wonder if it's because the virus is slowing down, or if testing is falling behind...

Tuesday, December 24, 2019

The limits of computing

There is a lot of talk nowadays about the Singularity, the scary rate of development of intelligent machines, and the future of artificial intelligence. I'd like to discuss these concepts from a physics perspective. Some of the interesting questions to me are: how close to the fundamental limit of computation are modern computers? How close is the human brain? If we extrapolate a "Moore's Law" type curve based on energy efficiency arguments, when does it intersect human capabilities?

It is well known that the fundamental thermodynamic limit on how much energy erasing a single bit requires in an information processing system is k_B*T*ln(2). This is known as the Landauer limit. Actually, I lied. Some theorists argue that the Landauer limit can be exceeded, for example by using quantum spin instead of energy as a store of information, or by reversible computing. Nonetheless, let's assume that the Landauer limit holds true, and see what this means for the future of computation. As we'll see, theoretical arguments about exceeding it are analogous to discussing what we're going to do with all the excess clean energy. Spoiler alert: we are nowhere near an excess of clean energy, and we are nowhere near the Landauer limit.

So first of all: how close are modern computers to the Landauer limit? In other words, how many joules does one bit erasure require, and how close is this to the Landauer limit? Well, an efficient modern computer gets about 6x10^9 FLOPS (Floating Point Operations per Second) per watt (joule per second). Taking the reciprocal, that's 1.7x10^-10 joules per FLOP. Since a floating point number has 32 bits, let's assume that we get 5.2x10^-12 joules per bit in a modern computer. By comparison, the Landauer limit at room temperature (293 K) is 2.8x10^-21 joules per bit. In other words, modern computers are using 5.2x10^-12 / 2.8x10^-21 = 1.8 billion times more energy than the Landauer limit. There's a lot of room at the bottom!!! Even the most energy efficient modern computer is horribly inefficient at doing math. An energy analogy would be like if driving 100 kilometers in a car required the entire world's energy supply.

Redoing the above calculation in terms of FLOPS per watt for a Landauer-limited computer: 2.8x10^-21 J/bit * 32 bits/FLOP = 9.0x10^-20 J/FLOP, which works out to about 1.1x10^19 FLOPS/W.
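Here's a quick sanity check of these numbers in Python (the 6x10^9 FLOPS/W figure for an efficient modern computer is the one assumed above):

```python
import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 293                                # room temperature, K
landauer = k_B * T * math.log(2)       # ~2.8e-21 J per bit erased

flops_per_watt = 6e9                   # efficient modern computer (figure assumed above)
joules_per_flop = 1 / flops_per_watt   # ~1.7e-10 J
joules_per_bit = joules_per_flop / 32  # ~5.2e-12 J, assuming 32 bits per FLOP

print(f"Landauer limit:   {landauer:.1e} J/bit")
print(f"Modern computer:  {joules_per_bit:.1e} J/bit")
print(f"Ratio:            {joules_per_bit / landauer:.1e}")    # ~1.9e9
print(f"Landauer-limited: {1 / (32 * landauer):.1e} FLOPS/W")  # ~1.1e19
```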

OK so computers are pretty inefficient. What about the human brain? Well, first of all: how many operations does an average brain do? From what I have been able to find, somewhere between 10^12 and 10^28 FLOPS. Yes, you read that right! https://aiimpacts.org/brain-performance-in-flops/ The estimates for how many computations the human brain can do vary by more than 16 orders of magnitude. That's insane. 16 orders of magnitude is the greatest uncertainty I have ever seen. I'm not even going to try to make an analogy for that one because it's too much uncertainty to even comprehend. I think this brings up an interesting point for scientists around the world: come up with a better way to measure human brain performance!

Despite the craziness above, we can put an upper bound on performance: the Landauer limit! We know that the brain uses around 20W. So the upper bound on human brain performance (assuming no reversible computing or other fanciness) is 20W / 2.8x10^-21 J/bit = 7.1x10^21 bits/s. Assuming 32 bits/FLOP, that's 2.2x10^20 FLOPS. Breaking news: scientists discover that the human brain has exceeded the Landauer limit! Just kidding. A more likely explanation is that whoever estimated 10^28 FLOPS made some bad assumptions that got them 8 orders of magnitude above the Landauer limit! Or maybe the human brain inherently does reversible computing or something. My guess is that no, it doesn't. So the human brain is either 8 orders of magnitude less efficient than the Landauer limit, or 8 orders of magnitude more efficient. lolz.
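The bound itself is one line of arithmetic:

```python
brain_power_W = 20                         # rough power draw of a human brain
landauer = 2.8e-21                         # J/bit at 293 K, from above
max_bits_per_s = brain_power_W / landauer  # ~7.1e21 bits/s
print(f"{max_bits_per_s / 32:.1e} FLOPS upper bound")  # ~2.2e20 at 32 bits/FLOP
```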

OK back to computers. Koomey's Law suggests that the energy efficiency of computing doubles every 2.6 years (i.e., the energy per computation halves). If that's true, and given that modern computers are at 1.8 billion times the Landauer limit, we'll hit the Landauer limit in t = 2.6 * ln(1.8x10^9)/ln(2) ≈ 80 years.
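The same extrapolation as a tiny script:

```python
import math

ratio_to_landauer = 1.8e9    # modern computers vs. the Landauer limit, from above
koomey_doubling_years = 2.6  # assumed efficiency-doubling period (Koomey's Law)

years = koomey_doubling_years * math.log2(ratio_to_landauer)
print(f"~{years:.0f} years until the Landauer limit")  # ~80 years
```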

The energy singularity (when computers can do more work per joule than the human brain) will happen around 2045 if the human brain is 8 orders of magnitude less efficient than the Landauer limit, or 2100 if it's at the Landauer limit, assuming that computers continue to double in energy efficiency every 2.6 years. I predict that they won't. I think that the traditional silicon computer will not continue to double every 2.6 years and we'll need a dramatically improved computer architecture. This type of dramatic redesign will slow down scaling to doubling every 8 years or so, which will push the singularity way back in time.

I represented this on a graph:



The biggest sources of uncertainty are: how many equivalent computational cycles does the human brain perform? and how quickly will FLOPS/W scale as the limits of traditional silicon are reached?

A common criticism of this type of thinking is: "but the human brain doesn't work like a computer! it uses neurons and is good at parallel computing and doesn't use 1's and 0's." Response: actually, the brain is like a computer! A biochemical state, like "is this ion channel open?", can be thought of as a bit of information. Similarly, a relevant protein expression level or hormone level can be quantized. All the brain chemistry put together can be thought of as a computing machine, and it can be quantified in terms of FLOPS! Just because computers today do very different things than the brain does, does not mean they obey different physics; their hardware architecture is just different!

This is a first draft, so please let me know what you think, any feedback, any typos, etc., and I'll update it!

Wednesday, August 7, 2019

Tugfest physics


I received a message from a local news TV anchor:

"Hello, I'm with KWQC TV6 in Davenport, Iowa and am working on a story about a popular annual tug of war across the Mississippi River here in our area.  The tug of war involves a rope that is strung across the Mississippi River between one team in LeClaire, Iowa and the other in Port Byron, Illinois. 
I am wondering if you would be willing to provide some of your expertise regarding the impact that the elevation difference between the two locations has on the outcome of the event.  In other words, if one team is at a lower elevation than the other, does that team have an advantage or disadvantage as a result?  If we take the elevation of 587 feet for LeClaire and 604 feet for Port Byron, can you calculate or in some way scientifically express the impact this could possibly have on the outcome of the tug of war?  Thank you."

This message tickled the curious part of my brain. So here's my attempt to answer it:



1) In physics, we often draw a "free body diagram" of all the relevant forces on a system. I'm assuming that the rope does not touch the water. In this case, the forces acting on the rope look like this:




As far as I can tell, there is the force of the pullers in Le Claire (F_LC), the force of the pullers in Port Byron (F_PB), and the force of gravity acting on the rope (F_g). In the game (sport?) of tug of war, the team that moves the rope wins. So the idea is to exert a larger force than the other side. On level ground, gravity affects the rope in such a way that both sides experience the same effect. However, in this scenario, the rope will be hanging in such a way that one side will experience a different force than the other.

2) It is a well-known problem in physics that a suspended rope makes a catenary curve: http://www2.mae.ufl.edu/~uhk/HANGING-ROPE.pdf

3) There is a well-known saying in engineering that "you can't push on a rope" and so we can assume it is under tension at all points, and that it doesn't apply much bending force because it's long and skinny and floppy.

4) We can approximate the system as being in static equilibrium. Although technically a tug-of-war cannot be performed without movement (one side has to pull further than the other), we assume that ignoring inertia is OK since the static forces are more important than the dynamics for a tug-of-war.

5) We can imagine a ghost pulling on the rope at a position symmetric to the pullers at Le Claire.





From the ghost drawing, it's clear that the pullers at Port Byron have to overcome some extra forces due to the weight of the rope between the ghost and the Port Byron location. Because gravity pulls vertically, the only difference will be in the vertical component. This is the tiny green arrow in the drawing below:

In other words, the horizontal forces will be identical, but the Port Byron pullers will have to pull slightly harder, because they have an upward component of force also.

Now you are probably wondering: how much more force will the pullers at Port Byron have to apply? Let's guess that the rope angle is ~10 degrees at Port Byron, and let's assume they use manila tug of war rope. Here (https://www.ironcompany.com/manila-tug-of-war-rope) I found a tug of war rope that claims to be "1 lb per foot", or as the modern world would say, "1.5 kilograms per meter". Now at 10 degrees (made-up guess for the angle), we can calculate the amount of extra rope between the ghost and the Port Byron pullers using trigonometry (ignoring the bending of the rope for this section for now):

sin(10 degrees) = 5m/L_r, where 5m is the difference in elevation between the two locations and L_r is the length of excess rope. Solving for L_r we get: L_r = 5m/sin(10 degrees) = 28.8m. To get the weight of the rope, we multiply: 1.5kg/m * 28.8m = 43.2kg.

So by my estimate, the Port Byron pullers will have to overcome an additional 43.2kg of rope, or 423N of excess vertical force. That seems like a lot. However, the angle differences between the two sides are pretty small, and all that matters is the absolute magnitude of the force. Let's assume that each side is applying 5000N of horizontal force (I just made up a number because I have no idea how many people are expected at the event). The Port Byron pullers will have an extra ~430N of vertical force, so their net force vector will point slightly downwards (due to the excess gravity on the rope). If we ignore the 10 degrees and assume the rope is horizontal before applying the gravity correction, we get a magnitude of ~sqrt(5000^2 + 430^2) = 5018N, or an extra 18N in the direction of pulling. That's less than one percent difference. I think we can safely ignore the difference in physiology between pulling at 10 degrees from horizontal vs pulling at 11 degrees from horizontal. The human body should be able to do each equally well.
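Putting the whole estimate in one script (the 10-degree rope angle and the 5000N of horizontal pulling force are the guesses from above):

```python
import math

rho = 1.5                         # rope linear density, kg/m (manila rope figure)
g = 9.8                           # gravitational acceleration, m/s^2
dh = 5.0                          # elevation difference between the towns, m
angle = math.radians(10)          # guessed rope angle at Port Byron

L_r = dh / math.sin(angle)        # excess rope length, ~28.8 m
extra_vertical_N = rho * L_r * g  # ~423 N of extra vertical force

F_horizontal = 5000.0             # assumed horizontal pulling force, N
F_net = math.hypot(F_horizontal, extra_vertical_N)
print(f"excess rope: {L_r:.1f} m, extra vertical force: {extra_vertical_N:.0f} N")
print(f"net force: {F_net:.0f} N (+{F_net - F_horizontal:.0f} N, "
      f"{100 * (F_net - F_horizontal) / F_horizontal:.2f}% harder)")
```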

With a better estimate of # of pullers, their physical condition, and type of rope, the estimates could be refined.

OK, I just did some research about the event and found something interesting: there is a pulley used!!!



A pulley is a mechanical device that can change the direction in which a rope pulls. It is still true that Port Byron will have to overcome more rope weight, and thus, all things being equal, Le Claire has an advantage. However, the pulley position is incredibly important. An average tug of war participant is limited by the friction at their feet and their ability to grip the rope. If they can grip the rope sufficiently, and pulleys are allowed, then since a pulley redirects forces, it would be advantageous to put the pulley low to the ground so that the pullers are pulling more upwards than downwards (the rope then pulls them down into the ground, increasing friction). I'm assuming that pullers can theoretically pull with a force somewhat larger than their own body weight. Ideally, a trench would be dug for the rope and pulley to sit in, so that pullers are pulling at a downward angle, maximizing friction at their feet and overcoming that limitation. If the pulley could be placed at the bottom of a staircase-slope (e.g. cut some steps into dirt), that could be ideal for maximizing friction with the ground for a given pulley system. That, or make a ladder-like structure in the ground. I have illustrated this point with two hypotheticals, one where the pulley is placed high up (e.g. on a tree) - the stupid pulley, and one where the pulley is placed near the ground (the smart pulley):



I'm assuming that the weights of the two teams are equal. In that case, to maximize friction, put the pulley close to the ground. If you can choose your team, choose the heaviest people you can. Also, don't put sand in your pulling pit; that will make it impossible to apply large forces. Use a high-friction material, ideally something like a metal ladder.

I have one question that could invalidate my analysis: does the rope fully leave the water? If any part of the rope is touching the water, then water currents will have a big effect on the forces. I assumed that the rope leaves the water surface once it is taut. If the rope stays in the water, then river currents will determine the winner. OK, I looked it up: the rope stays in the water: https://www.wsj.com/articles/iowa-cant-beat-illinois-at-cross-river-tug-of-war-no-matter-what-they-pull-1473187822
So the drag forces from the river currents need to be taken into account. First of all, let's see if either location lies upstream of the other. Well, from a Google Maps search, it looks like Le Claire is down-river from Port Byron.




So in my assessment, Le Claire has the theoretical advantage in every respect: they have a lower elevation pulling platform, and most importantly, they are downriver.

OK I just found out something ridiculous. The winner is determined by who pulls more slack out of the water?!?! https://qctimes.com/community/bettendorf/port-byron-pulls-out-another-win-at-tug-fest/article_2311b1cd-990a-5d8b-8228-73a60b16ff44.html
So this is not really a tug of war at all. It's a competition to see who can leave their rope with more slack in it before they start pulling. The team with the higher platform will probably win if all you are measuring is who can pull more slack out of the rope while the river current works to pull it back down. So I think that the winning streak of Port Byron could be due to their elevated platform, which allows them to pull more slack out of a rope in a river. It has nothing to do with pulling harder than the other team. Is Port Byron really 5 meters higher? From the images I saw online, it doesn't look like that much... On that subject, I'm not sure where the host got the elevation numbers. Maybe from some random point in the towns? Also, what is the elevation of the pulling platforms with respect to river level?

In conclusion: Le Claire has a theoretical advantage in a true tug of war. However, because Tugfest is a "who can pull more slack out of the water" contest and not a true tug of war, Port Byron has an advantage, because an elevated platform gives more slack between the loose and taut rope conditions. This analysis is based on the numbers provided to me, which say that Port Byron has a 5m taller platform. I'm not sure that's accurate!!!

If you have additional ideas to improve the estimates please leave them below!

Monday, December 25, 2017

Bitcoin vs Bitcoin Cash


Bitcoin transaction fees are too damn high!

The rising transaction fees for Bitcoin have caused a major split, leaving the Bitcoin community divided into two camps: Bitcoin (BTC) and Bitcoin Cash (BCH). The two cryptocurrencies have competing ideas for how to reduce transaction fees: increase the blocksize, or build a new network on top of the blockchain.

The Bitcoin Cash camp is making the blocksize bigger (from 1MB up to 8MB). This results in much lower transaction fees as compared to Bitcoin. However, the problem with the Bitcoin Cash approach is that it's only a matter of time until the blocksize needs to be increased again. So it's a bandaid solution. In the eventual limit of gigantic blocksizes (give it a few years), the blocks will be so large that the average person will no longer be able to validate the blockchain. The current blocksize is 1MB per ~10-minute block, or about 53GB/year, for a total of ~150GB as of now. If you make it much larger, an average hard drive will no longer be able to store and validate the blockchain. If you extrapolate this pattern to the transaction volumes that VISA handles, then eventually only a giant corporation (Google?) will have a hard drive big enough to validate the blockchain. So this scenario ends in: "trust Google, your money is safe", or perhaps people will switch to an alternative currency that they trust more. But then again, human beings have been known to engage in some moral hazard in exchange for short-term monetary gains. Nobody lives forever, so pushing the problem out to the future may be OK. Maybe a band-aid is good enough.
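The storage arithmetic is simple enough to sketch out (10 minutes is Bitcoin's target block interval; 8MB is Bitcoin Cash's new cap):

```python
def chain_growth_gb_per_year(block_size_mb, block_interval_min=10):
    """Yearly blockchain growth for a given block size and block interval."""
    blocks_per_year = 365 * 24 * 60 / block_interval_min
    return blocks_per_year * block_size_mb / 1000

print(f"BTC (1 MB blocks): {chain_growth_gb_per_year(1):.0f} GB/year")  # ~53 GB/year
print(f"BCH (8 MB blocks): {chain_growth_gb_per_year(8):.0f} GB/year")  # ~420 GB/year
```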

The Bitcoin camp has taken a more creative approach than simply increasing the blocksize, and is building an entirely new transaction layer on top of the blockchain, that they call "The Lightning Network." The Lightning Network will allow people with Bitcoin to open a channel, and then do lots of little microtransactions outside of the blockchain, without having to pay a fee for every little transaction. It is only when they open and close a channel that the main Bitcoin blockchain learns about the final transaction, and so the fees are significantly reduced. This new system will allow for orders of magnitude more transactions, and therefore The Lightning Network is certainly not a band-aid solution. I would argue that The Lightning Network has the potential to be the killer app that the blockchain has been waiting for. The Lightning Network is in its infancy, and growth can be tracked here: https://explorer.acinq.co
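To make the channel idea concrete, here's a toy model of the concept. This is my illustration, not the actual Lightning protocol, which uses cryptographically signed commitment transactions rather than a trusted in-memory object:

```python
class Channel:
    """Toy payment channel: only open and close touch the blockchain."""

    def __init__(self, alice_deposit, bob_deposit):
        # Opening the channel is an on-chain transaction (one fee).
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}

    def pay(self, sender, receiver, amount):
        # Off-chain balance update: no fee, no block confirmation.
        assert self.balances[sender] >= amount, "insufficient channel balance"
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def close(self):
        # Closing settles the final balances on-chain (one more fee).
        return self.balances

ch = Channel(alice_deposit=1_000_000, bob_deposit=0)  # amounts in satoshis
for _ in range(1000):              # a thousand micropayments, zero on-chain fees
    ch.pay("alice", "bob", 1_000)
print(ch.close())                  # {'alice': 0, 'bob': 1000000}
```

The point is the fee structure: one on-chain transaction to open, one to close, and everything in between is free and instant.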

In the future, you will be able to open an account with a deposit of Bitcoin (with a small fee), and then make payments with zero fees, instantly, to anyone else who is a member of The Lightning Network. This will enable the microtransactions that have been the promise of Bitcoin for the last 7 years. And no, you won't need to provide your driver's license, address, passport, date of birth, and mother's maiden name to open a Lightning account! There is even an app that lets you try the test version of the Lightning Network: https://play.google.com/store/apps/details?id=fr.acinq.eclair.wallet

Nobody knows which camp will win (Bitcoin or Bitcoin Cash), but odds are currently ~80% in favor of Bitcoin and the Lightning network, and ~20% in favor of Bitcoin Cash and increased blocksize (https://www.coinexchange.io/market/BCH/BTC).

As of the writing of this article, 1 BTC = $15,340 USD

Sunday, December 3, 2017

A bold strategy for The Government of Venezuela: buy Bitcoin

The promise of eliminating runaway inflation by irresponsible governments was one of the major reasons I became interested in Bitcoin. I always imagined that a country like Venezuela or Zimbabwe would be the first to switch away from their fiat currency and adopt Bitcoin. Today, Venezuela announced that they are trying to launch their own cryptocurrency, the Petro. I imagine President Maduro saying: "I have an idea: let's combine all the security risks of cryptocurrencies (theft, lost wallets, etc) with the monetary policy of the Government of Venezuela." To him I say: good luck with that!

I have an alternative. The Govt of Venezuela should buy large amounts of Bitcoin. If they put 15% of their resources into buying bitcoins and another 15% into mining bitcoins, they could become the first country to have BTC as their official currency. They have the power to make the Bitcoin price skyrocket. They should do this in secret: if they delay the official announcement, then with the backing of a large government, the Bitcoin price will skyrocket after the announcement, and a government that bought at a low price will have made a direct profit on that investment. Venezuela could be a global version of "the rich get richer." Furthermore, the investment in mining equipment that I am suggesting is an investment in the future, as future Bitcoin mining will profit more from transaction fees than from raw currency creation. This would position Venezuela as a global processor of transactions, resulting in a net trade surplus from skimming profits on international transactions. Ironically, the instability of the Bolivar makes it possible for Venezuela to profit immensely. They could turn their weakness into a strength. A country with a stable currency wouldn't have the political will to attempt such a bold (but in my opinion profitable) strategy.

Instead, what will probably happen is that a bunch of Venezuelan politicians will flirt with some ideas, possibly launch the next Dogecoin with the same inflation problems as their existing currency, and the Government of Venezuela will miss a golden opportunity.