The thing is, we always knew this, but we did it anyway.
Like we know that destroying insect populations and global warming are not sustainable. We know we can't have 7 billion humans eating meat. We know we shouldn't let power concentrate too much in a few entities. But we keep it up.
Since we don't react unless something forces us to, I'm starting to believe we need a medium-sized crisis like Covid-19 to happen, so that a bigger one won't wipe us out in the future.
Sure, it’s the teacher. South Korea and China only did as well as they did because H1N1 and SARS were more problematic in those countries. Having been bitten twice, they revamped their health systems to be able to handle pandemics.
The reason for this global shortage is that China was celebrating Chinese New Year, when nearly all production simply stops. That happens every February. Then Covid-19 kicked in, and workers were effectively locked in their towns and could not travel. Local governments organized some emergency hires locally but still struggle to meet demand.
If Covid-19 had happened in another month, things would have been totally different.
Given that they didn't have the warning we had, have a very dense population, and host much of the world's manufacturing, it could have been much worse.
I can't imagine what would have happened if patient zero had been in New York.
Now of course, they handled that the way a dictatorship handles it.
Have you paid any attention to the charts? Even if the numbers aren't truthful, they can't be that far from the truth, or the discrepancy would be much more obvious than it is. Just take a look at this [0]! And this despite being the first to be hit, and having hundreds of millions of people living in densely packed cities!
From my limited observations, it seems like China (or any city/country/culture for that matter) did well in some ways, and poorly in others. This shouldn't be too surprising, because that's usually the case in any complex endeavor involving large groups of people. What is a bit surprising (well, unless you're a political news junkie), is that it seems like most individuals, or at least very, very many, are only able/willing to see one side or the other, as if they see the world in black and white (to test this theory, try disagreeing with such a person, and observe their reaction).
It seems to me that planet earth, and the various interconnected societies that live upon it, can be viewed as a system, like any other. Vastly more complex than any other system we deal with, but a system nonetheless.
Normally when an undesirable incident occurs in a system, we would get a group of people who have a good understanding of the system (or, systems in general at least) together to perform an appropriate post incident analysis, with a goal of identifying the various root causes, solutions to the causes, and an implementation plan. The space shuttle Challenger disaster [1] is a good example of such an analysis, in that it has many similarities (and at least two particularly noteworthy differences: number of deaths, quality of analysis) to the current pandemic.
It is fairly well known that when the people doing an analysis [2] of a system also happen to be participants within the system being analyzed [3], a number of unusual and undesirable behaviors can manifest. (A worst-case example: if the system is too complex, people may not only fail to realize that it is a system that can be analyzed, but may even be vehemently opposed to simply considering the notion. Rather, they may insist upon a far simpler, non-technical (and therefore inaccurate) approach of casting blame according to a combination of personal heuristics [4] and in-group dynamics [5], at times going so far as to ostracize any person who advocates for a standard engineering approach.) For these reasons, outsiders (notably, in this case, Richard Feynman) are often included in the analysis team.
I wonder if something useful could be learned from comparing and contrasting these two scenarios.
The Space Shuttle Challenger disaster was a fatal incident in the United States space program that occurred on Tuesday, January 28, 1986, when the Space Shuttle Challenger (OV-099) broke apart 73 seconds into its flight, killing all seven crew members aboard. The crew consisted of five NASA astronauts, one payload specialist, and a civilian schoolteacher.
Investigation
In the aftermath of the accident, NASA was criticized for its lack of openness with the press. The New York Times noted on the day after the accident that "neither Jay Greene, flight director for the ascent, nor any other person in the control room, was made available to the press by the space agency." In the absence of reliable sources, the press turned to speculation; both The New York Times and United Press International ran stories suggesting that a fault with the space shuttle external tank had caused the accident, despite the fact that NASA's internal investigation had quickly focused in on the solid rocket boosters. "The space agency," wrote space reporter William Harwood, "stuck to its policy of strict secrecy about the details of the investigation, an uncharacteristic stance for an agency that long prided itself on openness."
The Presidential Commission on the Space Shuttle Challenger Accident, also known as the Rogers Commission after its chairman, was formed to investigate the disaster. The commission members were Chairman William P. Rogers, Vice Chairman Neil Armstrong, David Acheson, Eugene Covert, Richard Feynman, Robert Hotz, Donald Kutyna, Sally Ride, Robert Rummel, Joseph Sutter, Arthur Walker, Albert Wheelon, and Chuck Yeager. The commission worked for several months and published a report of its findings. It found that the Challenger accident was caused by a failure in the O-rings sealing a joint on the right solid rocket booster, which allowed pressurized hot gases and eventually flame to "blow by" the O-ring and make contact with the adjacent external tank, causing structural failure. The failure of the O-rings was attributed to a faulty design, whose performance could be too easily compromised by factors including the low ambient temperature on the day of launch. The O-rings would not work properly at ambient temperatures below 50 °F (10 °C), and it was 36 °F (2 °C) on the morning of the launch.
More broadly, the report also considered the contributing causes of the accident. Most salient was the failure of both NASA and Morton-Thiokol to respond adequately to the danger posed by the deficient joint design. Rather than redesigning the joint, they came to define the problem as an acceptable flight risk. The report found that managers at Marshall had known about the flawed design since 1977, but never discussed the problem outside their reporting channels with Thiokol—a flagrant violation of NASA regulations. Even when it became more apparent how serious the flaw was, no one at Marshall considered grounding the shuttles until a fix could be implemented. On the contrary, Marshall managers went as far as to issue and waive six launch constraints related to the O-rings. The report also strongly criticized the decision-making process that led to the launch of Challenger, saying that it was seriously flawed: "failures in communication...resulted in a decision to launch 51-L based on incomplete and sometimes misleading information, a conflict between engineering data and management judgments, and a NASA management structure that permitted internal flight safety problems to bypass key Shuttle managers."
Richard Feynman
One of the commission's members was theoretical physicist Richard Feynman. Feynman, who was then seriously ill with cancer, was reluctant to undertake the job. He did so to find the root cause of the disaster and to speak plainly to the public about his findings. Before going to Washington, D.C., Feynman did his own investigation. He became suspicious about the O-rings. “O-rings show scorching in Clovis check,” he scribbled in his notes. "Once a small hole burns through, it generates a large hole very fast! Few seconds catastrophic failure." At the start of the investigation, fellow members Dr. Sally Ride and General Donald J. Kutyna told Feynman that the O-rings had not been tested at temperatures below 50 °F (10 °C). During a televised hearing, Feynman demonstrated how the O-rings became less resilient and subject to seal failures at ice-cold temperatures by immersing a sample of the material in a glass of ice water. While other members of the Commission met with NASA and supplier top management, Feynman sought out the engineers and technicians for the answers. He was critical of flaws in NASA's "safety culture", so much so that he threatened to remove his name from the report unless it included his personal observations on the reliability of the shuttle, which appeared as Appendix F. In the appendix, he argued that the estimates of reliability offered by NASA management were wildly unrealistic, differing as much as a thousandfold from the estimates of working engineers.
"For a successful technology," he concluded, "reality must take precedence over public relations, for nature cannot be fooled."
U.S. House Committee hearings
The U.S. House Committee on Science and Technology also conducted hearings and, on October 29, 1986, released its own report on the Challenger accident. The committee reviewed the findings of the Rogers Commission as part of its investigation and agreed with the Rogers Commission as to the technical causes of the accident. It differed from the Rogers Commission in its assessment of the accident's contributing causes: "the Committee feels that the underlying problem which led to the Challenger accident was not poor communication or underlying procedures as implied by the Rogers Commission conclusion. Rather, the fundamental problem was poor technical decision-making over a period of several years by top NASA and contractor personnel, who failed to act decisively to solve the increasingly serious anomalies in the Solid Rocket Booster joints."
Heuristics are simple strategies or mental processes that humans, animals, organizations and even some machines use to quickly form judgments, make decisions, and find solutions to complex problems. Heuristic processes are used to find answers and solutions most likely to work or be correct. This does not mean, however, that heuristics are always right. In situations of risk, heuristics face an accuracy-effort trade-off, where their simplified decision process leads to reduced accuracy.
Group dynamics is a system of behaviors and psychological processes occurring within a social group (intragroup dynamics), or between social groups (intergroup dynamics). The study of group dynamics can be useful in understanding decision-making behaviour, tracking the spread of diseases in society, creating effective therapy techniques, and following the emergence and popularity of new ideas and technologies. Group dynamics are at the core of understanding racism, sexism, and other forms of social prejudice and discrimination. These applications of the field are studied in psychology, sociology, anthropology, political science, epidemiology, education, social work, business, and communication studies.
The three main factors affecting a team's cohesion (working well together) are: environmental, personal and leadership.
Human beings are complex, inconsistent, and many other things. Put them all into interactive systems (cities, nations, the globe, cultures/ideologies) and the complexity increases.
It is easy to imagine this being an opening paragraph in the history books for the WWIII chapter. Large chunks of the world economy are currently on ice and we haven't even seen what horrible economic policies governments are going to use to respond to this in the medium term. I bet that when the dust settles it'll be just like every other crisis where the situation is decidedly different and substantially worse.
The economy is both critical and far too complicated for anyone to understand exactly what is going on at the moment. We could be looking at anything from a temporary wobble that disappears in 6 months to a long-rolling 20 year catastrophe.
Probably we'll go backwards -- the end death count will be compared with normal death counts and be likened to a bad flu season. People are less likely to panic next time, but also less likely to pay attention to things like lockdowns.
I suspect one outcome will be a long-term reduction in civil liberties.
The US has a government agency called BARDA (Biomedical Advanced Research and Development Authority) that at some point estimated there are 10,000 gaps worldwide when it comes to pandemic prevention, and that closing these gaps would need a budget of $10BN/year. They didn't get that.
Now, after a few trillion down the drain, I think they'll get it.