
Tuesday, September 27, 2016

Elections are too important to allow voting system insecurities

With the presidential election just weeks away, some Americans still are concerned about the security of the election process. But don’t worry; America’s always-reliable news media assure us that those concerns are unfounded.

To wit:
~ “No, voter fraud actually isn’t a persistent problem,” says The Washington Post online.
~ “Study Finds No Evidence of Widespread Voter Fraud,” states NBC News.
~ “Republicans’ ‘voter fraud’ false flag: Voter ID laws offer imaginary solutions to imaginary problems,” blares a Salon.com headline.

A great deal of contrary evidence exists, however, some of it new, some not so new. In 2008 the ACORN voter registration scandal saw names like Mickey Mouse and Donald Duck turn up on registration forms in Nevada. And one of the most ridiculous examples of voting irregularity occurred in Washington, DC, in the shadow of the Justice Department, where an undercover reporter recorded himself giving his name as Eric Holder, who at the time was the U.S. Attorney General, and being offered a ballot without showing an ID or being questioned about his identity.

The Pew Center on the States found that nearly 2 million dead Americans were still on the books as active voters, that 2.7 million people were registered in more than one state, and that 12 million voter records had incorrect addresses or other discrepancies. All of these are potential opportunities for fraud.

The Daily Signal reported on a 2014 Old Dominion University study of noncitizen voting, which found that “6.4 percent of all noncitizens voted in the 2008 election and 2.2 percent voted in the 2010 midterm elections,” and suggested that noncitizen votes likely helped Democrat Al Franken defeat Republican Norm Coleman by 312 votes for a U.S. Senate seat from Minnesota in 2008.

The group Minnesota Majority investigated claims of voter fraud in that election, comparing criminal records with voter rolls, and found 1,099 felons who had voted illegally. National Review reported: “Prosecutors were ultimately able to convict only those who were dumb enough to admit they had knowingly broken the law, and that added up to 177 fraudulent voters. Nine out of ten suspect felon voters contacted by a Minneapolis TV station said they had voted for Franken.” The 177 admitted fraudulent ballots alone could not have overturned Franken’s 312-vote margin of “victory,” but if the full 1,099 illegal ballots broke nine to one for Franken, the net swing would have far exceeded that margin.
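For readers who want to check the arithmetic, here is a quick sketch. It assumes, purely for illustration, that the nine-in-ten split from the TV station’s small sample held across all 1,099 suspect ballots; the actual distribution of those votes is unknown.

```python
# Rough arithmetic behind the Minnesota example. The 90 percent figure is
# the Minneapolis TV station's sample, assumed here to hold for all 1,099
# suspect ballots -- an illustrative assumption, not a verified count.
illegal_ballots = 1099
franken_share = 0.9

franken_votes = illegal_ballots * franken_share   # ~989 ballots
coleman_votes = illegal_ballots - franken_votes   # ~110 ballots
net_swing = franken_votes - coleman_votes         # ~879-vote net swing

print(f"Estimated net swing toward Franken: {net_swing:.0f} votes")
print("Certified margin of victory: 312 votes")
```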

New York City’s Department of Investigation sent 63 undercover investigators posing either as dead people or as people who no longer lived in the city. Of those, 61 were cleared to vote. Confronted with this evidence, the City Council decided not to demand accountability from the Board of Elections, but to prosecute the investigators for impersonating voters, according to National Review columnist John Fund.

Just this month, CBS4 in Denver reported on an investigation that found numerous examples of dead people voting and other irregularities. It noted that a Colorado congressional race was decided by just 121 votes and an Ohio tax measure by just two.

There simply is no question that fraud exists in elections at all levels, and as previously shown, it is significant enough to affect election outcomes.

Despite these and other “irregularities,” certain factions continue to oppose efforts to clean up problems at all levels of the election system. State efforts to impose voter ID requirements, one of the best ways to validate potential voters at the polling place, perhaps draw the most vociferous opposition.

Opponents of voter ID and other sensible requirements often fall back on the argument that voting is a right of all citizens of legal age and therefore ought to be easy, and they claim that requiring a photo ID to vote places a hardship on some citizens.

This argument is defeated by reality: The Washington Examiner listed 24 routine activities that require a photo ID, among them buying alcohol and cigarettes, applying for Medicaid or Social Security, purchasing a gun, getting married, applying for a job or unemployment benefits, driving, buying, or renting a car, adopting a pet, visiting a casino, holding a rally or protest, buying an "M"-rated video game, buying a cell phone, and applying for food stamps and welfare.

But if failing to require measures that make the system more secure makes voting easier, that ought to set off warnings: while it may be easier for legal voters to vote, it is also easier for ineligible persons to vote.

One might think that since voting is a critical right, all Americans would want that right protected from infringement by ineligible voters.

Certainly, the U.S. Supreme Court subscribes to this idea. Commenting on the need for secure elections in United States v. Classic, 313 U.S. 299, 329 (1941), Justice William O. Douglas wrote: “Free and honest elections are the very foundation of our republican form of government. Hence any attempt to defile the sanctity of the ballot cannot be viewed with equanimity.”

Rhetorical question: Why would any good and honest American oppose efforts to ensure that only legal voters are registered to vote and able to cast a ballot in any and every election?

The obvious answer is that an insecure election process enables cheating for nefarious political reasons.

Tuesday, September 20, 2016

After eight years, Obama’s Energy Secretary visits West Virginia

Those who lived in or near the coalfields of southern West Virginia and southwest Virginia during the peak of the coal business in the 1950s and ’60s know that state and local economies thrived because of the tens of thousands of people employed by mining companies and by the dozens of businesses that supported the industry.

The Norfolk and Western Railway yard in Bluefield, WV, was always filled with coal cars, many of them loaded with coal, one of the world’s most widely used fossil fuels, bound for the port at Norfolk, VA, or ready to be unloaded into trucks for delivery. The rest were empties, heading back into the coalfields to be refilled and brought back for distribution.

They remember the bustling downtown that was the financial, shopping and recreational center of the region’s coalfields, and Bluefield’s population of well over 20,000 residents during the time of peak coal. These are valued memories of the good times.

Today’s population is half that size, and the rail yard is often empty. To those who have seen first-hand the industry’s decline and its effects on local communities, that decline is a very real and painful thing.

The decline began with natural technological advances, as mechanization gradually put hundreds of miners out of work. Over time, other forces affected the industry, including the recent rise of cheap natural gas. Through all of that, there was always a market for coal.

But the greatest problem is the federal government’s assault on coal through excessive environmental regulation, spurred by the hotly debated idea that burning coal pours too much carbon dioxide, a gas essential for life on Earth, into the atmosphere. President Barack Obama put this attack into high gear. Yet today our air is cleaner than it has been in 100 years, thanks mostly to evolving technological improvements.

Cloistered away in their comfortable offices in Washington, DC, our public servants frequently have no idea what life is like for those toiling away to pay the taxes that fund their salaries. Perhaps if they got out of Washington more, they would understand the problems they create for the people they serve.

This may be the case with Energy Secretary Ernest Moniz, who, at the invitation of Sen. Joe Manchin, D-W.Va., finally visited the state last week after many invitations over the eight painful years of the Obama administration. While there, Moniz suggested there is no war on coal, arguing to the contrary that the administration is working to keep coal an important part of a low-carbon energy future. He also said that cheap natural gas prices are primarily responsible for coal’s downturn.

The absurd idea that there is no “War on Coal” would be hilarious if the reality weren’t so tragic, and the suggestion that the very recent drop in natural gas prices is the principal reason for coal’s decline is simply false.

This general situation was foretold by Barack Obama back in the 2008 campaign: “So, if somebody wants to build a coal plant, they can — it’s just that it will bankrupt them, because they are going to be charged a huge sum for all that greenhouse gas that’s being emitted,” Obama declared.

Assuming Moniz has the capacity to recognize the misery that the administration he serves has caused this region, and that he really cares about the people affected by its policies, a visit to West Virginia much earlier in the administration’s tenure might have made some difference.

Hillary Clinton is on that same path. While campaigning in Ohio earlier this year, she said, “We’re going to put a lot of coal miners and coal companies out of business.” Trying to soften that remark, she said she favored funding to retrain those put out of work, but she didn’t say what kind of jobs, or how many of them, are currently waiting for retrained workers.

Not long thereafter, while campaigning in West Virginia, she was asked about that comment by a tearful out-of-work coal miner. According to the Daily Caller, she responded that what she meant was that coal job losses would continue. See the difference?

Obama’s energy policy is like forcing a square peg into a round hole. The thoughtful approach is to take some time and gradually, gently reshape the square peg so that it fits the round hole comfortably. Obama’s method is to place the peg on top of the hole and beat it with a hammer until enough of the corners are destroyed that the peg goes in. And even then, it is a poor fit.

Just as horse-drawn wagons and carriages gave way to motorized vehicles, coal’s role as a primary fuel would have changed on its own as better methods evolved. Such a process would have been not only more humane and less destructive, but infinitely smarter than what has transpired.

Through the centuries humans solved life’s problems and improved their lives through applied intelligence. Somehow, they managed to do this without Barack Obama and the EPA.

Tuesday, September 13, 2016

The wildly outrageous costs of pharmaceutical drug production

Drug companies – “Big Pharma,” as they are called – are targets in America, especially after Mylan’s recent EpiPen pricing controversy and, earlier, Turing’s odious CEO Martin Shkreli raising the price of Daraprim more than 55-fold, from $13.50 per pill to $750, and reacting smugly to criticism of that questionable move. There are bad actors in all areas of life, of course, and pharmaceutical companies are no exception. Perhaps these two examples are evidence of bad actors at work.

Without getting into the minutiae of either of these situations – and certainly not defending either Mylan or Turing – here is some badly needed and eye-opening information about the business of producing pharmaceuticals.

Making drugs is a business, and like other manufacturers, drug producers find something people need or want and produce it. Life-saving drugs, or drugs that improve our health, are valuable and needed. Drug companies spend billions of dollars over many years to develop useful, needed pharmaceutical products, improve them so that they meet or surpass the FDA’s strict standards, and, once they are approved, market them.

In June of this year, the American Action Forum released research addressing the process of producing new drugs. The process “is extraordinarily expensive and time consuming,” the article stated. “A Tufts University study found that the average cost to bring just one drug to the market is about $2.6 billion. It takes an average of 15 years from the time a drug developer first begins testing a new formula until it is approved by the FDA. Only 1 in 1,000 drug formulas will ever enter pre-clinical testing, and of those, roughly 8 percent will ultimately receive FDA approval.”

Let’s say PharmX creates 100,000 drug formulas. Only one in a thousand, or 100 of them, will get to pre-clinical testing, and roughly 8 percent of those, just eight drugs, will receive FDA approval. PharmX will have invested on average $2.6 billion in each of the eight, so the company has to sell enough of each drug to pay for its development, to finance new research and development, and to earn some profit.
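For those who like the math laid out, here is a quick back-of-the-envelope sketch. PharmX and its 100,000 formulas are the hypothetical numbers from above; the attrition rates and the $2.6 billion figure are those cited from the Tufts study.

```python
# Back-of-the-envelope drug-pipeline attrition for the hypothetical
# PharmX example, using the figures cited in the text.
formulas = 100_000            # candidate drug formulas
preclinical_rate = 1 / 1000   # 1 in 1,000 reaches pre-clinical testing
approval_rate = 0.08          # roughly 8 percent of those gain FDA approval
cost_per_approved = 2.6e9     # average cost per approved drug, in dollars

preclinical = formulas * preclinical_rate   # 100 candidates
approved = preclinical * approval_rate      # 8 approved drugs

print(f"Pre-clinical candidates: {preclinical:.0f}")
print(f"FDA-approved drugs: {approved:.0f}")
print(f"Investment to recoup, per approved drug: ${cost_per_approved:,.0f}")
```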

Like other inventors, drug companies patent their products, or receive an exclusivity period. A patent is issued for 20 years from the date of filing, and drug makers usually file early in the development stage to prevent other companies from moving in on their idea. If it takes an average of 15 years to get a drug through approval and to market, the pharmaceutical company has on average only five years to sell enough of the drug to recoup the $2.6 billion in development, approval and marketing costs. At the end of the patent period and/or the exclusivity period, another drug maker might make a generic form of the drug, and sell it for a lot less.

So, when you do the math for a drug with development costs of $2.6 billion, you find that if PharmX charges a dollar a dose, it will have to sell 2.6 billion doses in five years just to break even; if it charges $100 a dose, it will have to sell 26 million doses. John LaMattina, senior partner at the venture capital firm PureTech, noted that drug development “is a high-risk, expensive, and long-term endeavor.” Classic understatement.
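The same break-even arithmetic, as a quick sketch; the $1 and $100 prices are the illustrative figures used above, not actual drug prices, and the five-year window is the average patent time remaining after approval.

```python
# Break-even arithmetic from the text: doses that must be sold during the
# ~5-year exclusivity window to recover $2.6 billion in development costs.
development_cost = 2.6e9   # dollars, per approved drug

for price_per_dose in (1, 100):   # illustrative prices from the text
    doses = development_cost / price_per_dose
    print(f"At ${price_per_dose} per dose: {doses:,.0f} doses to break even")
```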

Another aspect of this issue is that drugs made by U.S. companies often cost more at home than they do in other countries, such as Canada. It doesn’t seem right that Canadians can buy American drugs more cheaply than Americans can. But what is a drug company supposed to do when the Canadian government, or another government, wants to buy millions of dollars of its product at below-market prices while the company is trying to recoup billions in costs? Other companies likely make drugs that treat the same disease. Should the drug company pass up the opportunity, leave the millions of dollars on the table, and perhaps suffer financially as a result, while a competitor sells millions of dollars of its product to those countries at a below-market price?

Another obstacle to manufacturers’ ability to recoup the cost of bringing a new drug to market is that regulations imposed by other countries, sometimes to protect their own companies, make the potential market for sales smaller.

And despite the rigorous development and testing required to gain the FDA’s approval of a drug as safe for public use, the required warnings about potential side effects on product sheets, and the fact that drugs are prescribed by patients’ doctors, drug manufacturers still get sued by patients.

Doing business in the U.S. is a real challenge: often burdensome and unreasonable regulations and other hurdles make producing needed and wanted products and services difficult and expensive.

The more expensive drug production is, the greater the need for high prices. While we would all like lower prices for drugs and healthcare in general, we also want to continue to have companies developing new and better drugs and medical devices.

Wednesday, September 07, 2016

Federal welfare programs give freely and demand little

Americans, it is said, are the most generous people in the world. We give to our friends and neighbors and fellow countrymen when they need help, of course, but we also help those who live thousands of miles away in other countries.

We are quick to provide a “hand up” to Americans in need, to help them over rough spots and get them back on their feet so that they can then take care of themselves. There are those who for various reasons are unable to help themselves, and we don’t mind continuing to provide assistance for them.

The hand up is sometimes called a “safety net,” a device to save those truly in need from falling into despair. But for many the safety net has turned into a hammock, no longer a device to help out in an emergency or time of trouble, but an easy way of life for those who would rather let others provide for them than provide for themselves.

This is sometimes a matter of availing themselves of a good opportunity, while at other times it is a matter of culture: Far too many Americans have been taught through actual experience that it is not so difficult to live off the government and charitable interests.

A friend taught a class in the 1980s at a junior high school whose student body had a poor reputation for academic achievement. He tells the story of his first six-week grading period, in which he used a grading system designed to reward honest effort as much as a grasp of the subject matter. Of the 37 students in his class, half failed; only a few earned decent grades.

When he asked them how they were going to survive once they grew up and were on their own, if they could not earn a passing grade in a class designed to guarantee passing for an honest effort, one of the students said: “Well, Mr. Smith, I’m going to do like my parents: be on welfare.” That career choice surprised him, and so did the agreement of many of the other students.

This situation, mirrored in towns and cities across the nation, is the result not of the “hand up” efforts of caring Americans, but of hammock-like government welfare programs, which give much but demand little.

President Lyndon Johnson declared a War on Poverty in the January 1964 State of the Union address. “This administration today, here and now, declares unconditional war on poverty in America,” Johnson stated.

His actual stated goal was not to prop up living standards artificially through an ever-expanding welfare state, but instead to strike “at the causes, not just the consequences of poverty.” Ultimately, he wanted “not only to relieve the symptom of poverty, but to cure it and, above all, to prevent it.” A noble goal, as so many government initiatives are, at least at first.

Twenty years ago, another president pledged to “end welfare as we know it.” On August 22, 1996, President Bill Clinton fulfilled a campaign promise by signing welfare reform, the Personal Responsibility and Work Opportunity Reconciliation Act, into law.

This time there were new wrinkles: after two years of receiving benefits, welfare recipients would be required to work, and incentives that encouraged having children out of wedlock and breaking up families to qualify for benefits were removed. There was also a five-year lifetime limit on receiving benefits.

How have these programs worked out? Familyfacts.org reported in 2012, “Total federal and state welfare spending has increased more than 16-fold since 1964. Even since the 1996 welfare reform replaced Aid to Families with Dependent Children (AFDC) with the Temporary Assistance for Needy Families (TANF) program, spending has increased by 76 percent and by more than 20 percent since 2008.”

President Obama, the Washington Examiner reports, “took the Great Recession as an opportunity to get as many households as possible into the food stamp program, an important part of his stimulus package. One result was that the number of able-bodied adults with no children who receive food assistance doubled.”

Because the value of food stamps and welfare payments is not counted as income under the official poverty measure, the overall poverty rate has not changed much since the War on Poverty began. Meanwhile, both the number of Americans on welfare and total welfare spending have soared.

The goal should be to reduce both poverty and welfare spending. Two states, Kansas and Maine, have implemented a requirement for able-bodied childless adults to work for food-stamp benefits, and the results are impressive.

In Maine, 80 percent of those affected by the requirement left the food stamp program; in Kansas, the number of those affected dropped 75 percent very quickly, and 60 percent of those who left had found work within a year, according to the Examiner.

When it was easy to stay home and collect food benefits, many were happy to do so. But when required to work, these recipients quickly got out of the hammock and went to work, abandoning government support.


People are often content to do as little as possible, but will do what they must.