Power & Market
Thanks in part to Trump's bombastic and unpredictable style — but more likely due to his lack of friends in Washington — members of Congress have suddenly realized that maybe, just maybe, it's a bad thing that the President of the United States can unilaterally blow up the world.
And when I say "President of the United States" I don't mean that as a metonym for the US government as in the phrase "Washington today is considering a pact with Mexico."
No, a single specific individual really does have the ability to make that decision and give that order — unimpeded in any way.
This fact — which should daily be regarded by all Americans as an excellent illustration of what a farce "constitutional government" is — is now a topic of debate in Washington. It is now being suggested that some of those alleged "checks and balances" we're always being told about might be applied to the most destructive and apocalyptic power enjoyed by a US government agent.
Congressional lawmakers raised concerns about President Donald Trump's ability to use nuclear weapons during a hearing on Capitol Hill Tuesday amid bipartisan anxiety over launch process procedures and indications that the administration has considered the option of a first strike on North Korea.
Members of the Senate Foreign Relations Committee called into question a decades-old presidential authority to deploy nuclear weapons in what was the first congressional hearing on nuclear authorization in decades.
You read that right. This is the first time in decades that Congress has considered the question of a president's nuclear-warmaking prerogatives. Congress has, meanwhile, been quite busy during that time holding hearings about steroid use in sports and violence on television.
As it stands right now, the president can start a nuclear war all by himself. We're talking about first strike capability here, and not about merely a response to military action by another state.
The Los Angeles Times tells us how easy it is:
All he has to do is call in the military officer who carries the “football,” the bulky briefcase containing the nuclear codes, and work through a brief procedure to transmit launch orders to U.S. Strategic Command. ... “There are really no checks and balances,” said Bruce G. Blair, a former nuclear launch control officer who is now a researcher at Princeton University. “The presidency has become a nuclear monarchy.”
"Nuclear dictatorship" probably better captures the reality of the situation.
Thus, all the president has to do is decide — perhaps based on whatever unreliable information the CIA is feeding him — that now is the time to unleash a nuclear holocaust on, say, North Korea. Once the bombers are flying, or once the missiles are launched, of course, we'll then have to hope that none of them are interpreted as threats to major nuclear powers like China and Russia, both of which are right next door.
Indeed, it's this unpredictability about how a nuclear strike might get out of hand that has long been a limiting factor on the use of these weapons. During the Vietnam War, for example, using nuclear weapons was discussed as a possible alternative to the failed bombing strategy of the time. The problem strategists encountered was the sheer volume of unpredictable consequences that could result from their use.
The downsides of starting a nuclear conflict are immense, both in terms of global diplomacy, and in terms of actual risk to the American population.
But even with this reality staring us in the face, Washington is so obsessed with maintaining an aggressive military stance that it's unwilling to seriously consider any limitation on the President.
This is why we should expect no real changes out of these Congressional hearings. Not surprisingly, Congress has already taken any meaningful change off the table:
Ultimately, the panel warned against legislative changes to rein in the President's authority to exercise nuclear authority. "I think hard cases make bad law, and I think if we were to change the decision-making process in some way because of a distrust of this President, I think that would be an unfortunate precedent," said Brian McKeon, who previously served as Principal Deputy Under Secretary of Defense for Policy during the Obama administration.
Mark J. Perry writes this week at AEI:
In 1973, when commercial TV in America was an oligopoly of only three major networks (ABC, CBS, and NBC), economist Murray Rothbard presciently predicted in his book For a New Liberty: The Libertarian Manifesto the eventual rise of pay-TV (Netflix, HBO, Showtime, Amazon, etc.).
Perry then quotes this section (which can be found on page 122 of the PDF):
Furthermore, if TV channels become free, privately owned, and independent, the big networks will no longer be able to put pressure upon the FCC to outlaw the effective competition of pay-television. It is only because the FCC has outlawed pay-TV that it has not been able to gain a foothold. “Free TV” is, of course, not truly “free”; the programs are paid for by the advertisers, and the consumer pays by covering the advertising costs in the price of the product he buys. One might ask what difference it makes to the consumer whether he pays the advertising costs indirectly or pays directly for each program he buys. The difference is that these are not the same consumers for the same products. The television advertiser, for example, is always interested in a) gaining the widest possible viewing market; and b) in gaining those particular viewers who will be most susceptible to his message.
Hence, the programs will all be geared to the lowest common denominator in the audience, and particularly to those viewers most susceptible to the message; that is, those viewers who do not read newspapers or magazines, so that the message will not duplicate the ads he sees there. As a result, free-TV programs tend to be unimaginative, bland, and uniform. Pay-TV would mean that each program would search for its own market, and many specialized markets for specialized audiences would develop—just as highly lucrative specialized markets have developed in the magazine and book publishing fields. The quality of programs would be higher and the offerings far more diverse. In fact, the menace of potential pay-TV competition must be great for the networks to lobby for years to keep it suppressed. But, of course, in a truly free market, both forms of television, as well as cable-TV and other forms we cannot yet envision, could and would enter the competition.
Deutsche Bank strategist Jim Reid suspects that global demographics and other realities may soon be putting the current fiat-money regime to the test. According to Business Insider:
Reid's basic contention is this: The dominance of the fiat currency system since Richard Nixon decoupled gold from the dollar in 1971 "is inherently unstable and prone to high inflation," and an offsetting disinflationary shock that kept it afloat since 1980 is now slowly reversing.
If that's the case, Reid says the fiat currency system — a term which describes any currency whose value is backed by the government that issued it, rather than by a commodity like gold or silver — could be "seriously tested" over the next decade.
But why now?
According to Reid, since the 1970s, many world economies have benefited significantly from a number of deflationary forces. Chief in Reid's mind is "an explosion in the global working-age population" which has led to declines in wages and an ability to produce immense amounts of goods and services at low prices.
(Other deflationary forces, which aren't mentioned in the BI piece, include the technological gains that Alan Greenspan was always so fond of mentioning when he spoke publicly. It's true that the labor supply has increased, but so has the usefulness of capital in making less-expensive goods.)
Reid calls this large growth in the global labor force a "demographic super cycle" and warns that any reversal in the cycle "could spell problems for the fiat currency system."
Well, thanks to these deflationary forces, central banks "can respond with familiar tools: More leverage, loose policy, and extensive money-printing."
Thanks to so many factors that are pushing prices downward, central banks can massively expand the money supply and still maintain some semblance of price stability.
Buried in this explanation, of course, is what Austrians have long pointed out about prices: In a modern economy, the natural thing for prices to do is go down. Contrary to the deflation-phobia exhibited by so many economists today, falling prices are a signal of improvements in capital, and possibly of greater access to capital by workers. Neither of these things is a danger to an economy.
Thus, as Reid notes, without so much central-bank money printing, global prices would likely have been declining for the past two or three decades, just as they did during much of the late 19th century in the US when living standards were increasing substantially.
So, while central bank money printers think everything's fine because their price indices show "low" inflation, the real cost of money printing has likely been the loss of a beneficial deflation.
In other words, consumers could have benefited from repeated drops in the cost of living in recent decades. But instead, they get mild inflation which robs them of the cheaper goods that would have existed in the absence of central bank meddling. Fortunately for central banks, though, few voters and consumers view things that way, and instead have bought the idea that prices are naturally flat, and thus, an inflation rate of, say, two percent is no big deal.
In reality, voters and consumers should be comparing an inflation rate of 2 percent not to 0 percent, but to, say, negative 2 percent. In that scenario, central bank inflation should be viewed not as 2 percent, but as 4 percent. Every year. Compounding.
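The compounding gap is easy to quantify. Here is a minimal sketch, using the illustrative 2 percent and negative 2 percent rates from above and an arbitrary 20-year horizon:

```python
# Illustrative: compare the price level under mild inflation
# with the counterfactual of mild deflation.
years = 20
inflation = 1.02   # 2% annual inflation (the central-bank scenario)
deflation = 0.98   # 2% annual deflation (falling prices)

price_inflated = inflation ** years   # price level after 20 years of inflation
price_deflated = deflation ** years   # price level after 20 years of deflation

# The ratio is the compounded purchasing-power gap between the two worlds.
gap = price_inflated / price_deflated

print(f"Price level with 2% inflation: {price_inflated:.2f}")
print(f"Price level with 2% deflation: {price_deflated:.2f}")
print(f"Compounded gap: {gap:.2f}x")
```

Under these assumed rates the two price levels diverge by a factor of roughly 2.2 after two decades, which is what an effective 4 percent annual wedge (1.02 / 0.98 ≈ 1.04) compounds to.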
Reid is now worried that these deflationary factors may be coming to an end, and once they do, central banks won't be able to use their usual tricks. And if that happens, the age of fiat money will be in trouble.
Inside Higher Ed reports that yet another small private college is closing:
Two weeks ago, Memphis College of Art said it would close. Also last month, Grace University, in Nebraska, announced plans to shut down, and Wheelock College announced plans to merge into Boston University.
In another sign of the challenges facing small private colleges without substantial financial resources, St. Gregory's University, in Oklahoma, said Wednesday that it would end operations at the end of the fall semester. The university is a private liberal arts institution about 40 miles from Oklahoma City.
Last summer, Marketwire covered the topic with its article "Why so many small private colleges are in danger of closing," which analyzed how small colleges are having to offer discounts on tuition to get people in the door. Larger institutions, both public and private, aren't having this problem.
The minuscule size of some of these colleges is astounding. The Chronicle of Higher Ed reports:
Of the 1,600 private nonprofit colleges and universities in the United States, almost 30 percent have enrollments of under 1,000 students. And though closings have amounted to less than one percent of private colleges, according to David Warren, president of the National Association of Independent Colleges and Universities, a Moody’s Investors Service report last fall indicated that the pace appears to be increasing. As we know, when one of the more recognizable small institutions is threatened with closure — Sweet Briar, Mills, Antioch — and brought back from the brink, at least temporarily, there follows a flurry of news stories about small colleges and the economic peril they face.
Needless to say, it's hard to take advantage of economies of scale with an institution that has only a few hundred students. The overhead costs of old buildings alone must be enormous.
And from a student's point of view, it's hard to see why many of them would want to drop everything for 4 years and move to a small town in the middle of nowhere to attend a tiny college with few resources, and which few people have even heard of outside the surrounding region.
Even worse is the fact that these small private colleges tend to be incredibly expensive. Nowadays, few people have the resources and leisure time to pay $80,000 for an education at a small college in a small town where there are few opportunities for earning income to supplement one's living expenses.
Indeed, many of these colleges have more the feel of a resort than of a serious educational institution. Many of them are in bucolic settings with old-timey buildings that help one re-enact "the college experience" one sees in television shows and movies. And in the end, for those who earn degrees of little value, such as a women's studies degree, this is essentially what an "education" at these institutions amounts to: a very costly four-year vacation from the realities of the world.
For more savvy consumers of education, of course, a large university in the heart of a metropolitan area makes much more sense. These universities have laboratory resources. They have better faculty. They have access to better internships with businesses and hospitals, and to part-time jobs that can help pay the bills. And, of course, if one goes to a public urban university (such as IUPUI in Indianapolis) one is likely to leave school with actual job opportunities after paying a mere fraction of the price necessary to attend No-Name U in Tinytown, Illinois.
Moreover, if people stopped blowing tens of thousands of dollars on these schools, we'd hear less about the immense amounts of debt that many students take on and then claim they had to borrow in order to get an "education." A lot of the time, these huge debt levels were taken on to finance four years of not working in a charming small-town atmosphere, all the while claiming such expenses were absolutely necessary.
For all of these reasons, over time, we'll see more and more of these small colleges go away. Once interest rates on student loans start to go up — which is a certainty in the medium and long term — these Vacation Colleges are going to look even less attractive than they do now, and they'll become a niche market for the wealthy and/or clueless.
Is it illegal to physically attack another person in Kentucky? If so, it's unclear why there is any need or cause whatsoever for the involvement of federal authorities in Rand Paul's recent altercation with a neighbor at his Kentucky home. Indeed, even if Paul had been murdered, there'd still be no reason to involve federal authorities.
Murder is illegal in every one of the fifty states. It's even illegal in Puerto Rico. But, for the last several decades, there's been a disturbing trend in American criminal justice matters. Everything is being federalized.
This was not always the case, however. Consider 18 U.S.C. 351, the federal statute covering attacks on members of Congress:
This law provides for a death penalty for killing a member of Congress, a presidential or vice presidential candidate, or a Supreme Court justice, as well as imprisonment up to life for attempting to kill such a person...
The background of this law is interesting. When President John F. Kennedy was assassinated in Dallas in 1963, it was not a federal crime to kill a U.S. president. Had alleged assassin Lee Harvey Oswald been tried, the trial would have taken place in a Texas state court. In 1965, Congress passed a law, 18 U.S.C. 1751, making it a federal crime to kill, kidnap, or assault the President or the Vice President.
In 1968, presidential candidate and U.S. Senator Robert F. Kennedy was assassinated in Los Angeles. That was not a federal crime at the time, and Sirhan Sirhan was convicted in California state court for the murder and sentenced to death. (That sentence was commuted to life in prison in 1972, when that state abolished the death penalty, and Sirhan remains in a California state prison.) In 1971, Congress enacted 18 U.S.C. 351, which extended the protection of the Federal criminal law to members of Congress, paralleling that extended to the President and the Vice President.
This fits well into the usual habit of transferring the business of criminal justice to federal authorities with the effect of further extending federal powers and increasing prosecutorial resources that can be brought to bear against the accused. This federalization helps to further diminish the role of the states in the administration of public policy, to extend the reach — and expense — of federal courts, and to impose the costs of a double-layered legal system on the taxpayers.
Prior to federalizing these laws, had there been any ambiguity as to whether it was illegal to murder, kidnap, or commit battery against these people? Were the streets running with the blood of murdered federal politicians?
Of course not. These laws do send a valuable message, however. They remind us that there is one set of laws for a special protected class of federal officials, agents, and employees, and a second set of laws for the people who merely pay for it all.
In a new report for the TaxPayers' Alliance in the UK, Ben Ramanauskas makes an important point: deficit spending and government debt are moral issues, and not just matters for arcane economic theory. That is, when current voters side with current politicians to drive a government deeper into debt, they hand down a big fat bill to future taxpayers and citizens who have no say in the matter right now:
There are significant moral implications of having a large national debt. Money which is borrowed today will have to be paid back at some point in the future, perhaps by people who are yet to be born. As a result of the profligacy of current governments, a burden will be placed on future generations who will have to pay higher taxes and have less money to spend on essential services. It is one of the defining principles of Parliamentary Supremacy that Parliament cannot bind its successors. The reasoning behind this is that it would be an affront to democracy to allow future generations to be bound by previous generations.
However, by having such a high national debt, the government binds future generations and curtails their freedom to choose by ensuring that they will have to spend a significant proportion of their money servicing the debt, which also places restrictions on what they can spend their money on, and will also have implications for levels of taxation.
Therefore, increased borrowing will result in a burden being placed on future generations. A high national debt can have numerous negative consequences. For example, a high level of debt can lead to an increase in the yields paid on UK sovereign bonds. This is because if investors believed that the UK’s national debt was so high that it would be at risk of defaulting on its debt or that the country would inflate them away, they would need to be incentivised to purchase the UK’s gilts by high yields. Very high national debt can have a negative impact on economic growth. For example, borrowing can crowd out other investment as investors loan money to the government, rather than to the private sector. Nations typically see growth slow when their debt levels reach 90 percent of GDP, with the median growth rate falling by 1 percent and average growth falling by even more.
Moreover, research focussing on the US has found that raising the Federal deficit has an adverse effect on the economy by reducing private sector investment, economic growth, and employment. As mentioned above, government debt has to be paid. Furthermore, interest payments have to be paid on the debt. This, therefore, places restrictions on government budgets and so diminishes their ability to be able to spend money on essential services. Moreover, in order to repay and service the debt, governments tend to either raise taxes or decide not to lower them. An in depth explanation of the folly of increasing taxes and the benefits associated with tax cuts goes beyond the scope of this paper, and the TaxPayers’ Alliance has written extensively on this topic. However, the evidence is clear that tax increases tend to be harmful for the economy, whereas tax cuts tend to have a positive impact.
Although it may seem an attractive policy to borrow money in order to fund government spending, this is not a sensible approach. Although interest rates are historically low, government borrowing is not free and has to be funded. Furthermore, although there have been other periods in its history when the UK has had a high level of national debt, the socio-economic situation is very different from those periods. It should also be remembered that not only has this money got to eventually be paid back, but that also interest has to be paid on the debt too.
These interest payments represent a significant proportion of government expenditure, and is money which could have been spent on essential public services such as healthcare, education, or provision for the elderly. Moreover, proponents of the idea that the government should take advantage of low interest rates by borrowing more are correct to point out that rates are historically low, but that is precisely the point. They are historically low, and so one should not expect them to remain as such over the coming years and decades. Furthermore, we have seen that even a small increase in rates of one per cent, increases the national debt as a percentage of GDP significantly in the long term. Furthermore, there are serious economic and moral ramifications to increasing national debt. For example, a high national debt can seriously hamper economic growth. Moreover, increasing the national debt places a burden on future generations who will have to pay it back.
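The report's warning that even a one-point rise in interest rates compounds into a much larger long-run debt burden can be sketched with the standard debt-dynamics identity. All of the parameter values below (starting debt ratio, growth rate, primary deficit, horizon) are hypothetical, chosen only to illustrate the compounding:

```python
# Illustrative debt dynamics: the debt-to-GDP ratio evolves roughly as
#   d[t+1] = d[t] * (1 + r) / (1 + g) + primary_deficit
# where r is the average interest rate paid on the debt and
# g is nominal GDP growth (both as decimals, deficit as a share of GDP).
def debt_path(d0, r, g, primary_deficit, years):
    d = d0
    for _ in range(years):
        d = d * (1 + r) / (1 + g) + primary_deficit
    return d

# Hypothetical economy: debt at 85% of GDP, a 2%-of-GDP primary deficit,
# and 3.5% nominal growth, simulated over 30 years.
base = debt_path(0.85, r=0.02, g=0.035, primary_deficit=0.02, years=30)
higher = debt_path(0.85, r=0.03, g=0.035, primary_deficit=0.02, years=30)

print(f"Debt/GDP after 30 years at 2% rates: {base:.0%}")
print(f"Debt/GDP after 30 years at 3% rates: {higher:.0%}")
```

Under these assumed parameters, the one-point rate increase adds roughly 25 percentage points of GDP to the debt ratio over three decades, which is the compounding effect the report describes.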
As Ramanauskas notes, it's not just a matter of higher bills for government services either. All that extra spending discourages private-sector investment as well, creating a more run-down, more capital-impoverished version of the future than would have otherwise been the case.
In America, at least, this is the legacy of the current Baby Boomer generation, and their parents. They want their Medicare, and their highly-paid government jobs, and federally-subsidized roads, and endless wars fought in the far reaches of the world. But their children and grandchildren will be paying the bill.
Apparently, South African Airways is "on the verge of bankruptcy" and is, according to the BBC, "haemorrhaging cash." Unfortunately for South African taxpayers, SAA is also a state-owned operation, it's deeply in debt, and it may not be able to make payroll in the near future.
Clearly, there's a problem.
James Peron, writing at South Africa's Business Day suggests Ludwig von Mises may have the answer:
The mess that is South African Airways (SAA) is widely known today. What many do not realise is that in 1944, Yale University published a book that laid out the reasons for the mess.
While it is true that Ludwig von Mises’s Bureaucracy does not mention SAA by name, it does dissect the differences between "profit management" and "bureaucratic (or political) management".
Mises argues that under each system of management, there exist incentives. Managers and/or owners respond to those incentives.
Transfer the bureaucrat to a system of "profit management" and his actions will change. Put a businessman in charge of a bureaucratic system of governance and he will act like all the bureaucrats before him. Change the incentives and you change the response.
The key word here, when it comes to incentives, is "political." It's not the fact that SAA has a bureaucratic structure; most large organizations do. The difference between SAA and a private organization is that profit is not the primary motivation — because bailouts and other political solutions can substitute for serving customers in the marketplace. If SAA ceases to make a profit, as already seems to be the case, it will continue to exist so long as government agents see fit to continue subsidizing it. In a truly private market, of course, an organization that fails to make a profit — regardless of how "bureaucratic" it is — ceases to exist.