31 December 2013

Mexican Slavery

I don't think I have ever brought up this topic before.  It is one of the most controversial topics today.  It also turns out that a vast majority of people, on both sides, have no clue what it really entails.  Today, I am going to discuss illegal immigration.

The other day, I was reading about illegal immigration, for a class I was taking.  I forget what the article was about, but I started reading through some of the comments left by readers.  One lady said that she was against illegal immigration, because it takes jobs away from legal American citizens.  She then showed her complete ignorance by stating that she has a friend who would gladly take one of the construction jobs currently held by an illegal Mexican immigrant.  The construction industry in the U.S. is fairly well regulated.  Certifications are required for many types of construction work, and most locales have at least three levels of construction code requirements (Federal, state, city, and/or county).  Even for construction jobs that do not have Federal requirements, many states require that anyone with less than journeyman status work under a journeyman as an apprentice (I did this on a few occasions doing insulating work with a friend).  On average, more than half of construction workers are skilled workers.  If they are Mexican, they are very much legal.  Maybe some shady backwoods construction jobs hire illegal immigrants, but reputable construction companies don't.

So, the question then is, if there are so many illegal immigrants here, what are the jobs that they are taking?  Many are working for small businesses that are not willing to pay minimum wage.  It is pretty hard for an illegal immigrant to complain that they are getting ripped off, when they will certainly be thrown out of the country if they do.  I wonder if this lady's friend is willing to work as a bagger or cashier at a small grocery store, for $2.50 an hour.  That is not where most of the jobs are though.  My research found that a vast majority of illegal immigrants in the U.S. are working in one of two industries.  In Washington state, many work in the logging industry.  In general though, a vast majority work in agriculture.  Now, keep in mind that in the U.S., minimum wage does not apply to most agricultural work.  The vast majority of illegal workers (including those who return to Mexico after harvest season is over) in the U.S. are doing agricultural work, which pays $2 to $4 an hour and does not violate minimum wage.

Now we can say that Mexican immigrants are taking thousands of agricultural jobs from legal U.S. citizens, right?  This is actually false.  It also turns out that most Americans will not do agricultural work, even if they cannot find any other work.  One farm workers' organization listed agricultural job openings with some very large unemployment agencies, and out of the tens to hundreds of thousands of unemployed Americans notified of the openings, a grand total of three responded.  These are not jobs that Americans would have if the Mexicans did not do them for less than minimum wage.  These are jobs that Americans refuse to do.  The evidence shows that most unemployed Americans would rather go on welfare than work in agriculture.  The immigrants are not stealing jobs.  They are filling a need for labor that no one else is willing to do.

It gets worse though.  It also turns out that the reason the above-mentioned organization of farm workers started posting job openings is that many Mexicans won't do farm work anymore either.  Many of the jobs were filled by Mexicans who live in Mexico and come to the U.S. seasonally to do farm work.  The economy in Mexico has improved somewhat, and now many Mexican farm workers have found that they can get higher quality and better paying work staying in Mexico than working on farms in the U.S.  One farmer complained that the lack of workers was going to cause part of his berry harvest to go bad before it could be picked.  Similar worries have been heard from other farmers as well.  As the economy in Mexico has improved, jobs on U.S. farms have opened up.  Unfortunately, few Americans (three, specifically) are willing to do that kind of work.  As the Mexican economy continues to improve, we can expect to lose more and more farm workers.  Here is the catch: Farm workers are essential to our ability to produce food.  Now, last time I checked, 60% of the food produced in the U.S. is exported.  So, decreasing production by 60% would not cause us to starve, but it would cause other countries to have food difficulties, and it would force a lot of farms to close, which would eliminate all of the administrative jobs that are held by legal American workers.  Obviously, if food production dropped more than that, people in the U.S. would start to starve.  In short, without Mexican farm workers, we would lose a large number of jobs that Americans are willing to do, we would lose a major export, we would probably have to start rationing food, and some people would starve because they would not be able to afford the new prices for food.

All of this brings up a lot of questions.  The obvious one is, how can we cut down on illegal immigration without destroying our ability to produce enough food for everyone?  The solution is even more obvious: Illegal immigration is only illegal because it is illegal.  Since it is defined by the fact that it is illegal, making it legal would eliminate the problem entirely.  Now, I am not necessarily for opening the borders and walking away.  We could, however, make it much easier for honest Mexicans to immigrate.  We could even just make it easier to get short term work visas for farm workers.

A second, less obvious question this brings up is, why is minimum wage different for agricultural work?  Also, farm workers don't get overtime pay either.  Why?  The idea with minimum wage is that it is designed to make sure that workers get enough wages for their work to pay for living expenses.  The idea with overtime is that more than 8 hours of work a day (or 40 hours a week) is excessive labor, and the employee deserves extra pay for the difficulties caused.  Do farm workers somehow have a lower cost of living, such that lower pay is justified?  Is farm work so easy that overtime should not apply?  The answer is very obviously "no."  If we limit farm workers to legal workers (whom the law is designed for), they have the same cost of living as anyone else.  Working in agriculture does not magically decrease cost of living.  The overtime thing is worse.  The reason Americans won't do farm work is that it is grueling labor.  Farm work is some of the hardest physical labor available in the U.S.  If anyone deserves extra pay for excessive work, it is farm workers.  So, the question "why" still stands.  The answer is, farming is not as profitable as many other businesses (or, it was not, back when this particular part of labor law was designed; modern industrial farming techniques have dramatically increased profitability of farming).  The justification is, this type of business needs the lower minimum wage, and farm workers work 12 to 16 hours a day, almost every day they work.  As such, the cost to pay workers fair wages is supposedly too much for farms to handle.  Does this really justify blatantly ripping them off though?

This reveals an ethical dilemma.  We need food.  In fact, food is one place where normal market profit maximization techniques are obviously wrong.  Profit maximization math balances demand with price.  The problem with doing this for food is that when demand decreases, it means that people are not buying the product, often because they cannot afford it.  For food, this means that people starve.  So, if profit is maximized, around 25% of people will not be able to afford food (this is based on common trends with other products; in the U.S., given that a vast majority of people are in the lower class, it might be closer to 50% or more).  Maximizing profit for food forces people to starve.  So, the dilemma is this: Allow the food industry to be entirely capitalistic, require farms to pay fair wages, and have a significant percentage of the population starve, or allow farms to pay wages worth far less than the average compensation given to slaves in the Old South (they at least got room and board), to keep food costs low enough that everyone can afford it?
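To make the profit maximization point concrete, here is a minimal toy sketch in Python.  The numbers (a $2 per-unit cost and willingness to pay spread evenly from $0 to $10) are made up purely for illustration; the exact fraction priced out depends entirely on the assumed demand curve, but the mechanism is the same: the profit-maximizing price always leaves some buyers unable to afford the product.

# Toy model, not real data: buyers' willingness to pay is spread evenly
# from $0 to $10, so at price p the fraction of buyers who will buy is
# (10 - p) / 10.  Assume a $2 cost per unit.
def profit(price, cost=2.0, max_wtp=10.0):
    demand_fraction = max(0.0, (max_wtp - price) / max_wtp)
    return (price - cost) * demand_fraction

# Search prices from $0.00 to $10.00 in one-cent steps for the maximum profit.
best_price = max((cents / 100 for cents in range(0, 1001)), key=profit)
priced_out = best_price / 10.0  # fraction of buyers whose limit is below the price
print(best_price, priced_out)   # 6.0 0.6: the most profitable price excludes 60% of buyers

In this toy case the most profitable price shuts out well over half of the buyers; with food, those are the people who starve.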

What this really comes down to is, when is the U.S. really going to outlaw slavery?  Americans recognize that farm work, at current legal wages, is not much different from slavery.  This is part of the reason Americans will not take farm jobs.  Many Mexicans are used to conditions worse than slavery, so seasonal slavery on a farm, that actually pays, is better than what they are used to.  That still does not make it right though.  (Note, if you are sitting there thinking, "It's only slavery if they are forced to work," look up "chattel slavery."  That is what you are comparing it to.  Chattel slavery is not the only kind of slavery.  Voluntary slavery is still slavery.)  We managed to outlaw blatant chattel slavery long ago.  Now we are battling a different kind of slavery that is much more insidious.  Until we recognize it as slavery, the American Dream will be a false hope available only to those who already have it.  Let's free the Mexican slaves!

Poverty and Minimum Wage

I read a study recently about poverty.  It claimed that the legal definition of "poverty level" does not match the reality of poverty in the U.S.  Specifically, for the government to consider a U.S. citizen or family to be under the poverty level, the household must be making less than somewhere around $20,000 a year (I think the actual figure is within a thousand of $23,000).  The study looked at cost of living all around the U.S.  Because cost of living is different in different places, the poverty level should also be different.  Using the actual cost of living, the study found that the real poverty level averages over $30,000 a year.  There were a few places where it was as low as $25,000 a year, but there were many places where it was very close to $40,000 a year.  Given the legal definition of "poverty level," which is used by many government welfare agencies, there is almost a $20,000 gap where people who need government welfare cannot get it.  Imagine the dilemma of the guy who is making $19,000 a year, gets all of the food for his family through food stamps, and is paying only half the normal rent in subsidized housing.  What happens when his boss offers him a promotion that adds $5,000 a year to his salary?  If he accepts, he will make $24,000 a year.  Unfortunately, this will not cover his $30,000 a year expenses.  Further, losing the food stamps takes out $3,000 a year from his income, and losing qualification for subsidized housing takes out another $5,000 a year.  If he does not take the promotion, he stays at $19,000 a year, but food stamps give him $3,000 for food, and the housing subsidy gives him $5,000 for his family's apartment.  Before the raise, his gross income is actually $27,000, only $3,000 short.  After the raise, his gross income is $24,000.  He would be $6,000 short this way.  The only way he can take a promotion is if his salary is increased by more than $8,000 a year (plus the additional taxes, which he will now have to pay with his increased income).  What are the chances that he will eventually be offered a raise big enough to justify accepting it?  Approximately nothing.  Most employers do not even offer a $3,000 raise for a single promotion.  This guy will probably be stuck barely in poverty ($3,000 short) for the rest of his life, because any raise that would help him progress out of poverty will disqualify him for welfare and cause him and his family to starve and be thrown out on the street.  I guess the American Dream only exists for those who are already rich or the lucky few geniuses who can pull off a massively successful invention and manage to keep the rights long enough to get paid.
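To make the welfare cliff easier to see, here is a minimal sketch in Python using the approximate figures from the paragraph above.  The benefit amounts and the cutoff (roughly the official poverty level) are simplifying assumptions for illustration, not actual program rules.

# Rough welfare cliff arithmetic, using the figures discussed above.
# The cutoff and benefit values are illustrative assumptions, not official rules.
COST_OF_LIVING = 30_000   # estimated actual yearly expenses
WELFARE_CUTOFF = 20_000   # roughly the official poverty level
FOOD_STAMPS = 3_000
HOUSING_SUBSIDY = 5_000

def effective_income(salary):
    benefits = FOOD_STAMPS + HOUSING_SUBSIDY if salary < WELFARE_CUTOFF else 0
    return salary + benefits

before = effective_income(19_000)  # 27,000: $3,000 short of expenses
after = effective_income(24_000)   # 24,000: $6,000 short of expenses
print(before - COST_OF_LIVING, after - COST_OF_LIVING)  # -3000 -6000

A $5,000 raise leaves him $3,000 worse off than before; only a raise of more than $8,000 even breaks even.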

This gap between the actual poverty level and the legal definition of "poverty level" is not the only gap that causes a major problem.  There is a similar gap between Federal Minimum Wage and the legal definition of "poverty level."  Current Federal Minimum Wage is $7.25 an hour.  Now, the legal work week is only 40 hours, according to Federal Labor Law.  There are only four professions which can legally be required to work more than this (other businesses can ask their employees to work more hours, but if the employees refuse, they cannot do anything that could be construed as punishment or retaliation).  There are also only 52 weeks in a year.  Doing the math, we find that Federal Minimum Wage guarantees a yearly wage of only $15,080 for a full, legal work week.  This is only 75% of the Federal definition of "poverty level."  The Federal government itself is practically encouraging entry level businesses to keep their employees in poverty.  Worse, the actual poverty level is closer to $30,000 a year, which means that in reality, the government is allowing businesses to pay only half what a person needs to survive.  This is encouraging an 80 hour work week, with the additional overhead of working for more than one employer.  Further, most minimum wage jobs only schedule employees half time, to avoid paying benefits.  This means that the government is actually encouraging an 80 hour work week, split between four jobs.  Note that each job has extra overhead for taxes, administration of the businesses, and often extra travel time.  This is absurd.  How stupid do lawmakers have to be to miss this problem?
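The arithmetic is simple enough to check directly.  A quick sketch, using the roughly $20,000 official figure and the roughly $30,000 estimate discussed above:

# Full-time, full-year income at the 2013 Federal minimum wage.
MIN_WAGE = 7.25
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52

annual = MIN_WAGE * HOURS_PER_WEEK * WEEKS_PER_YEAR
print(annual)           # 15080.0
print(annual / 20_000)  # ~0.75 of the official poverty level
print(annual / 30_000)  # ~0.50 of the estimated actual poverty level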

So, there are a few possible solutions.  The worst is to increase the legal work week to 80 hours and remove laws requiring benefits.  This would solve all of the problems for the businesses (besides extra hiring costs to replace dead employees).  This also happens to be 16 hours a day, 5 days a week.  The 40 hour Federal limit was put in place to solve this problem, and it would be inhumane to change it back.  Also, if benefits are removed, the poverty level should be raised another $5,000.  The second solution is to fix the government definition of "poverty level."  This will allow more people to get on welfare, which will help to mitigate the problem.  Unfortunately, this will cost a lot of money, and if the government cannot get up the nerve to tax large businesses more, it will result in economic disaster.  Note that this solution should be used, but it will not fix the problem by itself.  The third solution is to raise minimum wage such that working full-time in any job will pay enough to stay out of poverty.  This will still not help people who cannot find jobs or who cannot find 40 hours of work each week, but it will take a huge number of people out of poverty.  The ideal solution would be both of the last two.  Fixing the definitions problem will make welfare work for everyone who needs it.  Fixing minimum wage will get a huge number of people off of welfare, which will balance the effect of fixing the definitions.  In this system, the guy above would already be out of poverty, if he is working full-time.  If he is not, he will be able to remain on welfare until he is entirely out of poverty.  This will allow him to accept the promotion without endangering himself and his family.

Now, there will certainly be some negative repercussions to this.  Businesses that base their profit margin on underpaying employees will probably fail (we will lose a lot of fast food places; is that really a bad thing?).  Initially, unemployment will increase, which will increase the burden on welfare.  Ultimately though, this problem must be addressed.  If we do not start taking welfare money from the businesses that are causing it to be required, our government will eventually default on its debts, which will destroy its ability to borrow money.  This will force a massive shrinking of government, which will probably result in a depression like the world has never seen before.  We might as well start now, when it will actually fix some major problems, instead of waiting until we are on the verge of government collapse.

Because I am black

Just to be clear, I am not black.  The phrase I reference in the title, "It's because I am black," is a phrase that should never be uttered by any black person trying to gain fair treatment.  This particular post was prompted by at least one incident mentioned by Oprah Winfrey.  Evidently, on two occasions when she was in Europe, she was refused service.  She claims that both of these incidents were examples of racism.  The first was in France, where she entered a store several minutes after it had closed, and the store clerk told her he could not serve her because the store was closed.  The second, in Switzerland, was in a store that sold handbags.  When she displayed interest in a 35,000 Franc (about $38,000 US) handbag, the clerk suggested some less expensive options.  Oprah claims that both of these were cases of blatant racism.

The first case was clearly not racist.  No matter how racist the store clerk might have been, I find it extremely unlikely that he (or she...) would have acted any differently for a white person, or even a French white person (the French are notorious for being less than polite to foreigners).   The clerk might have been racist (there is no evidence of this), but his actions certainly were not.

The second case is not quite so clearly not racist, but if you consider the facts, there is a plethora of more reasonable explanations.  The first is that the clerk might do this for all customers.  It is possible that shoppers have left the store without looking back when they discovered the prices of the more expensive items.  Offering lower priced alternatives could be a good marketing strategy for retaining less wealthy customers.  While this is probably more likely than racism, it is still not that likely.  A more reasonable explanation is that Oprah was probably wearing a rather out of date style, and the clerk assumed that this meant she could not afford something more recent.  I can suddenly hear a lot of people becoming indignant, maybe saying, "Oprah would never wear something out of style."  Maybe that is true, but it is essential to consider how fashion evolves.  On average, Europe is about 20 years ahead of the U.S. in fashion (much like Japan is about 20 years behind the U.S.).  Oprah could have been wearing the latest U.S. fashion, and she still would have been 20 years behind in Europe.  If the clerk were not used to serving rich Americans, she would probably have mistaken Oprah's clothing for thrift store clothing donated by some rich person making closet space.  Now, while I think shopping at thrift stores is wise, even for the rich, I could easily see the clerk taking this as a sign of being less than very wealthy.  If she were getting a commission, I could easily see why she would suggest something cheaper to someone wearing apparently cheap clothing.  There are several other possible, non-racist, reasons that the clerk might have offered cheaper alternatives, but racism is about the least likely reason.  (Note that the shop owner claims that the event never happened.  It is possible that Oprah was just making up a story for the occasion.  Rich people seem to do that a lot.)

So, now for the point of this article: Crying racism makes black people (or people of any race) look bad.  Oprah is enforcing a stereotype that black people will cry racism whenever they think it will help them get their way.  Sadly, this bad habit of prominent black people has made it cliché to "play the race card" even when legitimate racism is taking place.  What they don't seem to understand is, this encourages racism.  Many employers are afraid to hire black people, because they fear lawsuits over racism where none exists.  It sometimes seems to be a lower liability to choose not to hire black people than it is to hire qualified black people at the risk of unfounded racism lawsuits.  Further, it is rather hypocritical to constantly cry racism when you are trying to get equal rights with everyone else.  Maybe these people think that if they constantly point out their race, everyone will magically forget about race.  A strategy that has seemed to work far more consistently is to forget about race entirely and act like you belong.  I have worked with people of different races before, and the most respected are the ones who, when people are rude, take offense at the insult itself, instead of treating every little comment as racist.  These people respect others and first assume that rudeness is unintentional, and when that is not possible, they assume the rudeness is directed at them and not their race.  One benefit of this is that even when someone is being racist, it takes race out of the equation.  This makes rudeness more personal, which is not what racist people typically want.  When people are racist, they are directing their rudeness at a group instead of an individual.  This feels less wrong than directing rudeness at an individual.  When the target of the rudeness ignores any racial element, it makes the rudeness personal, which makes racist people uncomfortable.  In other words, acknowledging racist behavior, whether intentional or not, actually encourages racism, while refusing to acknowledge the racial element discourages it.  (Note that this is true of modern society, but it was not in early America.  This is because targeting rudeness at specific racial groups was more socially acceptable then.  When a large number of people support racist behavior, racist people are less likely to feel guilt when the insult is made more personal.  Now, racist behavior is not as socially acceptable, so turning an insult personal is a far more effective way of discouraging racism.)

This is just one way that many American blacks separate themselves from everyone else (not just whites).  Another one (which I hope Oprah is intelligent enough not to be involved with) is gangs.  Black ghetto kids are taught young that blacks are oppressed.  They are taught that they will never be treated as well as whites.  They learn that they can have "families" that will help them, by joining a gang.  They are essentially making their own culture, and they are rejecting American culture.  The problem is, first, they are wrong.  Plenty of black people have gotten out of poverty and even slavery.  Some of the most famous black Americans in history started out as slaves (look up George Washington Carver, who was born into slavery and became a renowned agricultural scientist and teacher at the Tuskegee Institute, one of the early learning institutions for black people in the U.S.).  Gangs generate crime, and this crime makes black people look bad.  Yes, it is a stereotype, and stereotypes are not always accurate.  If you think you can convince the majority of people to discard stereotypes though, you are delusional.  The single most effective way to destroy racism is to become a functioning part of society, and to prove the stereotypes wrong (actually, this is more a matter of changing the stereotypes).  For ghetto black kids, this means staying out of crime, graduating from high school, and finding a way to go to college (even if it means scholarships and loans; just make sure to pick a profession that will pay off the loans quickly; consider engineering, we need more engineers anyhow).  For Oprah and other prominent black people, this means not pointing out race every time you feel insulted.  If someone insults you, take it personally, do not blame your race (or better yet, blow the insult off; if blacks were stereotyped as extremely difficult to offend, employers would feel more comfortable hiring black people than white people).  If you want to be treated like everyone else, quit pointing out all of the differences.  There are times when discussions about race are appropriate.  When you have just been insulted in a totally non-racial way is not one of those times.

One other thing: The employees at the stores Oprah mentioned have more right to be insulted than she does.  Racism is a serious thing.  Accusing people of racism without substantial evidence is extremely rude.  Imagine if you refused to do something for someone and they accused you of being a pedophile, a porn addict, or a drug addict.  Racism is somewhere in that range of seriousness.  Oprah very seriously insulted these two people, and she is lucky she did not cause an international incident.  Famous people have a responsibility to represent their home country in a good light.  Oprah's rather heinous behavior has reinforced another stereotype, that Americans are rude, unfeeling, egotistical jerks.  Thanks, Oprah, for making both black people and Americans look bad, all in one shot.

20 November 2013

Proud to Work 80 Hours a Week

If you just bragged on Facebook or Twitter that you or your spouse works 80 hours a week just to get by, this is for you.  If you are not that person, read on anyway, and enjoy the show.

Jane McGonigal, a rather well known woman who works in the video game industry and who has started doing research on the effects of games on humans, has given a number of TED talks, which I highly recommend watching.  In one of these videos, she describes some research she was involved with.  In this research, a number of nursing home residents were asked what their biggest regrets were.  The two most common answers were spending too much time working and spending too little time with family.  Someone (whom I have been unable to identify) once said something to the effect of, "Any fool can learn from his own mistakes; it takes a wise man to learn from the mistakes of others."  If you are working 80 hours a week, you should be asking yourself, are you going to be the fool who learns from his own mistakes only after it is too late, or are you going to be the wise man (or woman) who learns from the mistakes of a majority of old people who wish they had spent their lives better?

I also want to point something else out: Even black slaves in the South, before the Civil War, did not typically work 80 hours a week.  Slaves were expensive, and owners were fully aware that frequent 16 hour work days would cause injury, making the slaves unable to do their work.  Nowadays, this is not a problem.  Most employers do not pay any up front costs, so working people to death is quite profitable, so long as there is a line of replacements waiting outside.  If you are working 80 hours a week just to get by, your employer considers you expendable.  Does a wise person continue to work a job where the goal is to extract as much work as possible before throwing an employee aside and bringing in a fresh one?  Further, slaves in the South did not have to worry about where their food was going to come from, or how they were going to make rent next month.  It sounds like your employer is great.  You are working more than a slave, you get less compensation than a slave, and you do not even get to have the peace of job security.  Your life sounds awesome.

So what about the people who are working 80 hours a week, but could get by on less?  Can I ask, what is the point?  I suppose if you like your job a whole lot, maybe you could justify neglecting family and other relationships to spend all of your time working.  And maybe you do not really care about having time to put all of your hard earned money to good use.  Oh, and maybe you just do not care about your health either.  Actually, you are beginning to sound like someone with severe depression.  You should probably get some help for that.  In fact, people who spend all of their time working, even if they like their jobs, are at much higher risk for depression than others.  Not only does this cause major relationship problems, but depression also reduces productivity.  In other words, if you are working 80 hours a week, good luck continuing to be productive enough to keep your job.

There was a time, long ago, when working for someone else was humiliating.  The single exception was apprenticeships, where you (or your family) paid to work for someone else, in exchange for being taught that person's craft, so you would never have to work for someone else again.  Some jobs were not that bad, for instance, being a banker.  Obviously not every banker could afford to open his own bank.  The President of the United States (or any other elected government position) is another example.  On the whole however, most hired jobs were menial labor or mindless brain work that only the desperate were willing to do.  Many people bought land and started their own farms, to avoid the humiliation of working for someone else.  These people often worked 60 to 80 hours a week, and those with large families sometimes worked more.  They worked for themselves though, and they took pride in the fact that no one was their master.  During hard times, these people sometimes had to humble themselves and accept welfare from others, but while they preferred not to, they considered it better than working for someone else.  During better times, these people often paid it back by giving to others who were in need.  They only ever worked for someone else when there was no other option.  Working for someone else was rock bottom.

Now, however, I hear reports of people bragging on Facebook that they spend 80 hours a week working for someone else and barely even get compensated sufficiently to survive!  How low has our society sunk that the slave looks down on the freeman?  How bad has it gotten that social pressure encourages people to voluntarily become slaves, and people do it?  Gone are the days when slaves had to be guarded day and night to prevent their escape.  Now, slaves will stay in bondage even when there is no legal ownership demanding that they stay.  Is this really the American Dream?

It gets worse.  The U.S. government has set up programs that make the situation even worse.  I would like to believe that this is not intentional, but that might be naive.  Our current welfare systems help encourage slavery very effectively.  Contrary to popular belief, a majority of people on welfare work.  In fact, things like the Earned Income Credit require work to get paid, and most welfare in the U.S. penalizes people who are either not working or who are not looking for work (which is far more work than actually working).  So, the government tells us that we can only have sufficient welfare if we are already slaves or if we are trying to become slaves.  Worse, if we try to escape slavery by finding a job where fewer hours are required for greater pay, our welfare gets cut off, reducing our net income to below what is sufficient for survival.  Slavery may not be explicitly legally enforced, but even the government has designed welfare programs to prevent the escape from slavery.  It gets even better than this though.  All of that money for the welfare system comes mostly from the middle class.  The primary benefit of welfare goes to businesses that do not pay enough for their employees to get by.  You might try to claim that welfare is benefiting the employees and not the businesses.  You would be wrong.  If those employees did not get welfare, what would happen?  Many of these people already work multiple jobs, or spend their free time looking for a better job.  They do not have time to do even more work.  Without welfare, they have three options: keep working for too little to get by and starve to death, quit their jobs and start looking for better work full time (and probably starve to death anyway), or quit their jobs, enjoy life and wait for the end to come (and definitely starve to death).  I suppose they could try to get help from family, but the extra strain on their family members is likely to make the poverty spread.  Ultimately what happens to the businesses is, they lose all of their employees to starvation, suicide, or just quitting, and the businesses fail.  Alternatively, the businesses could raise wages to make up the difference.  Now, notice that the business ultimately takes the brunt of the blow if welfare stops.  The business fails, even if the ex-employees find a better job, unless the business starts paying higher wages.  That welfare money is not helping the slaves so much as it is reducing the costs of the businesses.  If welfare is removed and businesses fail while demand for their services still exists, they will be replaced by new businesses that will pay sufficient wages; otherwise those new businesses will fail almost immediately.  Our current welfare system robs the middle class to make life easier for the slave masters, under the guise of robbing the rich to pay the poor.
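Here is a minimal sketch of that subsidy argument, with made-up numbers.  The survival cost and the wage below are assumptions for illustration; the point is only that whatever welfare fills in is money the employer did not have to pay.

# Illustrative only: both numbers below are assumptions, not data.
SURVIVAL_COST = 30_000  # what the worker's household needs per year
wage_paid = 15_000      # what the employer actually pays

# If welfare covers the gap, the employer avoids paying this much in wages,
# so the transfer works as a subsidy to the business as much as to the worker.
implicit_subsidy = max(0, SURVIVAL_COST - wage_paid)
print(implicit_subsidy)  # 15000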

Now, do not get me wrong here.  I am certainly not suggesting that it would be wise to eliminate government welfare to fix this problem.  Notice above that the ultimate consequence for most welfare recipients is starvation and death.  This is bad, both morally and economically.  I have also discussed the inevitability of welfare as the primary form of income for a majority of the population, as the consequence of rapidly developing technology.  Welfare is necessary for the survival of our society, and it is necessary for the support of our ever growing lower class.  It needs to be fixed to be more friendly to escaping slavery, not eliminated.

Most Americans seem to have everything working against them.  Slave masters have convinced us that slavery is more noble than freedom.  This has been so effective that now the slaves brag on Facebook about their horrible work conditions and try to tell free men that they are stupid for recognizing it for what it is.  Over the last century, mankind has managed to produce a huge amount of labor saving technology.  We also managed to get the legal work week down to 40 hours.  This was a long, drawn out process that was finally codified into Federal law in 1938.  It has been like this for some 75 years, during which time numerous devices have been invented and adopted which dramatically reduce labor.  In fact, the computer has replaced inordinate numbers of people, just in the field of mathematical calculations.  The need for labor is far lower than it has ever been in the known history of the world.  It has been argued, and I agree, that a 20 hour work week should pay enough to support a small family.  Given production costs for various goods and their retail prices, large businesses can easily afford to pay sufficient wages for this, if they quit paying their CEOs and upper managers hundreds or thousands of times what they currently pay their other employees.  Instead, we have laws and social expectations that keep what is approaching a majority of Americans in poverty.

If you want to complain that you are working 80 hours a week, or even that you and your spouse are working 80 hours a week put together, just to get by, you are telling me that you are stealing needed work from three other families.  And one of those families is the person that you just accused of freeloading.  Those three families that cannot find decent jobs and are forced to live on crummy government welfare because you are stealing their opportunity to work have far more reason to be angry with you than you have to be with them.  At this point, government welfare is designed to make up the difference between what your employer pays and what wages would be fair.  By taking four times the work you should just to avoid taking what you consider an unfair handout, you are robbing others of the opportunity to earn money by working and the opportunity to get enough welfare money to make up the difference.  That is certainly nothing to brag about.  Shame on you!

18 November 2013

Video Games Rant

Last week in church, our lesson was on prioritizing time.  Actually, the official subject was avoiding being worldly, without excluding yourself from the world.  The two subjects that got the most attention were money and time, and time got the most, because money is always discussed, so it is nothing new.  Anyhow, at one point the teacher proudly told how he had prioritized his time better, by playing video games less and doing more physical activities, like running.  I kept my mouth shut, because the subject of the lesson was not whether video games or physical activity is more valuable, but now it is time for me to rant on this subject.

I regularly hear people say things that devalue video games and exalt physical activity.  The common "wisdom" of modern society tells us that video games are time wasters that have no value beyond entertainment.  Combine this with the fact that the occasional person gets addicted to video games, and common wisdom concludes that video games are entirely bad.  In addition, common wisdom tells us that violent video games make kids violent.  Given these things, it must be obvious that violent video games are especially bad.  There is one problem with the common wisdom though: So far, all scientific evidence contradicts these claims.  It actually turns out that some of the minor benefits of video games were fairly well known over 20 years ago.  In fact, in a public speech on Aug. 8, 1983, Ronald Reagan said, "I recently learned something quite interesting about video games. Many young people have developed incredible hand, eye, and brain coordination in playing these games. The air force believes these kids will be our outstanding pilots should they fly our jets."  It turns out that the long mocked benefit of increased coordination from playing video games is actually a very valuable asset in the military, among other professions (for instance, surgeons also need very good coordination).  Now, I call this only a minor benefit, because real science has found a large number of other benefits that go far beyond the limited applications of highly developed coordination.

First, I recommend watching all of the TED videos by Jane McGonigal (here is one).  In one of her videos, she talks about a study where people in rest homes were asked about their regrets and what they wished they had done.  The first notable thing is that none of the regrets included playing games too much (video or otherwise).  The things they wished they had done included things like spending time with family, working less, and other activities that games can help facilitate.  Another one of her videos discusses how adding game-like elements to real life can help improve motivation and even health.  She strongly recommends playing games (video games or other games) cooperatively with other people at least 20 hours a week.  In the video linked to above, she says that typically 10,000 hours of doing some activity is considered sufficient to become an expert in that activity.  She then points out that there are millions of people in the U.S. that have spent enough time playing various video games to be experts.  She asks, what are they experts in?  The answer is critical thinking and problem solving.  McGonigal says that if we can harness the problem solving power of all of these experts, we might be able to fix most or all of the world's problems.

Second, brain scientist Daphne Bavelier discusses actual scientific research in this TED video.  Science manages to debunk a large number of myths about playing video games.  One notable one is that playing a lot of video games damages eyesight.  The research Bavelier is involved with has shown that action games actually improve eyesight, as well as perception.  Gamers are more likely to notice small details than others.  She also discusses the claim that playing games leads to ADD or ADHD.  Again, this is false.  Gamers tend to be better at fast decision making where high focus is required.  They also tend to be better at keeping track of many things at the same time.  These are both things that people with ADD or ADHD have difficulty with.  Quantitative measures from Bavelier's research actually show that gamers are about twice as good at focusing compared to other people.  She also discusses several other lab tests they did, all of which found that gamers have greater brain development than people who do not play games.  After playing the games for several hours or several days, the above mentioned benefits were seen even in people who do not normally play games (and lasted up to 5 months).

Third, many studies have found that violent video games do not make children violent.  In fact, some studies have found that violent video games can work as an outlet for aggression, making aggressive children less aggressive towards other people.  The only case where video games and violence had a positive link was in people already predisposed to violence.  Otherwise stated, people who might be affected by video games to become more violent are people who already have psychological problems tending towards violent behavior.  These cases can be diagnosed before playing violent video games, and there are almost always warning signs long before any disastrous behavior occurs.  In other words, violent video games do not make anyone violent.  There is a very small amount of evidence that they may trigger violent behavior in people with diagnosable mental problems earlier than would otherwise occur, but they will not cause violent behavior that would not have eventually happened anyway without video games.

There is far more evidence than this that games are good and healthy, and the only evidence that they are bad shows harm only when they are played obsessively.  Real research shows that the common wisdom claiming video games are bad is not just wrong; it claims the opposite of the truth.  Video games are actually very healthy for the brain.

I am not done ranting yet.  There are some interesting implications of the above that I want to discuss.  The first thing about the Sunday discussion that bothered me was the assumption that physical activity is superior to playing video games.  Now, I am not saying that physical fitness is not important.  It certainly is, but if a person is already reasonably fit, and that person has to choose between games and exercise, which is the better choice?  The answer to this depends.  In a society where people have to do a lot of physical labor to survive, further increasing physical prowess is extremely valuable, and physical activity would be the clear winner.  We do not live in that society though.  We live in a society where physical labor is done primarily by the lower class and is compensated fairly poorly.  In our society, the high paying jobs are engineering jobs, which require highly developed problem solving skills.  The ability to perceive small details is a huge asset in higher paying jobs.  Even many higher business jobs require good problem solving skills.  Further, getting those physical labor jobs is pretty difficult when the economy has trouble, because almost anyone can learn to do that work.  During the recent recession, when construction jobs and nearly every other physical labor employment dropped considerably, jobs in problem solving (most especially engineering jobs including electrical engineering and computer science) were constantly available.  For these problem solving industries it was business as usual: There was high demand for employees with a low supply.  In short, so long as a person has enough physical activity to avoid health problems, the video games are more valuable in the long run.  Going to the gym or hiking with friends is great, but it seems like a poor decision to sacrifice good brain exercise for physical activity that is little more than entertainment.

So, would I be proud of myself for spending less time playing video games and spending more doing physical activity?  Certainly not.  I currently spend around an hour a day walking to and from classes, every weekday.  About 3 of those days, I run part of the way, to get my heart rate up and to maintain good lung function.  I consider this physical activity an important part of my routine, because it is necessary to maintain good health.  I am not an athlete, but this is sufficient exercise to maintain quite good health.  My weekly total of time spent playing video games is probably several more hours than my weekly total of real exercise.  As a Computer Science major, my problem solving skills are more important than physical activity, so long as I remain healthy.  Since my exercise routine is sufficient for health, it would be rather irresponsible of me to spend less time playing video games so that I can spend more on physical activity.  Spending less time on video games could easily result in a reduction of problem solving skills, which could make my work less valuable to an employer.  I have a responsibility to provide for my family, and an important part of that is maintaining a high value to my employer, so my employment will be continued.  The consequences of playing video games less, no matter how noble the alternative seems, could easily result in letting my family down.

I have said this before (somewhere; I forget exactly where): Playing video games is like working out for the brain.  In a physical workout, great strength can be gained, but it has no value if it is never used.  Similarly, through video games, great mental strength can be gained, but likewise it has no value if it is never used.  From this we should get two things: First, sitting in a basement playing video games constantly is truly a waste of time if the benefits are not used.  Second, playing video games is like working out!  Yes, that is exactly what I said.  If you are a parent and you are worried about how much time your children spend playing video games, ask yourself how you would react if your children were spending that time working out at the gym instead.  If your reaction to the gym is less negative, something is wrong.  Are you hoping your child will become a garbage collector or a cheap construction worker (there are some highly skilled construction jobs that require less physical work and more mental work), or do you want your child to be well suited to be an engineer, a surgeon, or a lawyer (or a CEO or...)?  Similarly, if you have a spouse or boyfriend/girlfriend who plays a lot of video games, consider the gym thing again.  It is possible to spend too much time at the gym, and if someone in your life is spending that much time playing video games, then it is perfectly reasonable to be concerned.  Otherwise it is completely hypocritical.

Now I am sure someone out there is wondering, "If video games are so beneficial, how did ancient humans get by without them?"  That is not an entirely unreasonable question, and it conveniently has a very simple answer: They lived them!  Ok, so that sounds both melodramatic and slightly absurd, but it is also true.  The most beneficial video games are those that require remembering where things are (mapping), noticing small details (perception), and making important decisions very quickly (decision making and problem solving).  Before we had all of this technology and knowledge, people had to hunt for food, they had to avoid or fight neighboring tribes, and they had to avoid predators.  Getting lost (mapping) was very often deadly.  If a person did not notice the stalking predator or warrior from a nearby tribe (perception) it would also result in a quick death.  If a person reacted poorly to a situation with a scout from a nearby tribe (attacked a friendly or tried to talk to a hostile; this is decision making), it could result in quick death or unnecessary war.  All of these things were literally developed as a consequence of the difficult conditions of real life.  People that live in a society without these difficult conditions do not get these opportunities to develop problem solving skills that are far more important than we give credit for.  Really, the only other reasonable (and cheap) place to get this training is video games that simulate similar conditions to those that ancient humans lived their entire lives in.  Many modern video games manage this very effectively.  The benefit with video games is that they do not require the grueling physical exertion (which can lead to poor choices as the body uses resources needed by the brain) and do not have the constant threat of death.  In other words, in games we can make mistakes and learn from consequences, without having to experience them first hand.

The moral of this is that video games are good for us, if we take advantage of the benefits they give us.  Even those who are traditionally against entertainment and for "going back to the land" should see the value in video games, as they train the human brain for problem solving in the way it was designed to learn.  Video games are more or less the most natural form of learning, aside from living under the constant threat of death.  I suspect part of the reason so many people are drawn to video games is that they use the brain in the most effective way possible.  Learning to play a new video game is easy, but learning a new school subject of equal complexity is often quite hard.  Games provide constant motivation to play.  In school, the only real motivation is a light at the end of a very long tunnel.  Games reward us for learning as much as we can.  School rewards us for doing as little as possible that will get us through it.  We can see this clearly by the fact that "piston" sometimes returns a Minecraft reference as the first result in a Google search (most of the time it is second to Wikipedia).  Many other searches return World of Warcraft results before real world knowledge.  Well designed video games can take advantage of this to teach useful real world applications of knowledge, but even games designed purely for entertainment can teach problem solving skills that cannot effectively be learned anywhere else.

Like Jane McGonigal, I recommend playing 20 hours of games a week (see her videos for the exact context).  Further, I recommend that at least 5 hours of this be video games where keeping track of location, noticing small details, and making quick decisions are integral parts of the game play (these do not all have to come from the same game, but I think it would be more efficient use of time if they did).  The most obvious genre for this is first person shooters (FPS, which I do not happen to enjoy much), but most real-time strategy games (RTS), role-playing games (RPG), and many games from other genres can satisfy these requirements (even Tetris is good for decision making).  Currently I have been playing a lot of Minecraft, which has small amounts of quick decision making, a huge amount of mapping, some small details, and an extremely strong focus on general problem solving (and if you do not like killing the occasional zombie or spider, you can set it for "Peaceful" mode, at the expense of quick decision making).

It is not hard to see major benefits from playing video games, unless you think that somehow a few more hours of physical activity is more valuable than useful work skills that are essential to comfortable survival in the modern world.  If you really want something to be proud of, quit wasting so much time on physical activity and spend a few more hours a week playing video games (or better, turn off the TV and spend the extra free time on video games; they are at least as entertaining and far more beneficial).

Lord Rybec

01 November 2013

Does poverty make people stupid?

Today I read a fascinating article that should completely change how we think about poverty.  Recent research has found strong evidence that poverty impairs cognitive function.  Researchers cited in the article (found here) describe it as something of a limited resource that poverty consumes very quickly.  Evidently the human brain has a limited capacity for difficult decision making.  Once that capacity is used up, it takes time to recharge, and during that time, the ability to make wise decisions can be dramatically reduced.  It happens that people in poverty have to make these difficult decisions (which bill should I pay, or how will I get to work without a car) a lot more than well off people.  In reality, poverty does not make people stupid, but it does tax their decision making abilities to a point where they are reduced considerably.

Several years ago, I read about a study that found that black people have a slightly lower average IQ than white people (in the U.S.).  Before anyone goes off, this study was not racist, and in fact the researchers went to great lengths to explain that the most likely cause was not racial.  Their final conclusion was that intellectual blacks tend to have fewer children than those who are less intellectual.  The result, according to the researchers, was a process of natural selection, selecting for lower IQ.  This more recent study offers a much more likely explanation.  In any kind of research, there is a thing called a confounding factor.  Confounding factors are things which might taint the data, but which cannot be controlled.  If they have a large impact, they can entirely invalidate the conclusions of a study.  In studies of psychology (IQ falls into this category), confounding factors are often not all known.  It turns out that the recent studies on poverty bring up a major confounding factor to the IQ study, which was not known when the IQ study was conducted.  In the U.S., a larger percentage of the black population is in poverty than of the white population.  If the sample for the IQ study was selected with accurate diversity, this would have been reflected in the results.  Since we now know that cognitive function can be diminished by poverty, there is a much better explanation for the results of the IQ study: Black people do not actually have a lower average IQ than white people.  Because a larger percentage of the black population is in poverty, black people score lower on IQ tests than white people.  The poverty study gives reasonable evidence that the lower scores are a result of the poverty, not of lower IQ.  Ultimately, the IQ study is really worthless, because the confounding factor was not accounted for, thus the data is tainted.  That said, the point of this is not to discuss how poverty affects IQ scores.  This is just an example of how poverty can affect critical thinking.

One problem the article points out is that diminished decision making ability is self-reinforcing.  People who are in poverty have a diminished capacity to escape poverty, because the poverty reduces their decision making skills.  In other words, poverty is a self-perpetuating trap.  It gets worse though.  Our welfare system, which is supposedly designed to help people get out of poverty, adds even more decisions.  To get on food stamps, you have to do a bunch of complex paperwork, which further consumes cognitive capacity.  If you don't get to your appointment exactly on time, you have to reschedule, even if the government workers have free time, and you were late because of an emergency.  Many government run employment centers have similar policies.  One of the results is that people in poverty tend to have poor time management skills, which makes them more likely to be late to an appointment, which adds more difficult decision making, which further reduces their time management skills...  This keeps going.  It results in poor diet decisions, poor time management, poor relationship decisions, and all sorts of other poor decisions that help maintain the downward spiral.  There is another problem though, which is not mentioned in the article.

The U.S. economy is having a lot of problems (if you were not aware).  Ironically, most of the problems are related to poor decision making.  Now, one thing I have to note.  The U.S. government has set the official, legal poverty level at something around $20,000 a year for a family of 3 or 4 (I forget the exact number, so don't be too surprised if you find they are a bit different).  External organizations, which do real research on cost of living and the like, estimate that it is closer to $40,000 a year, with a lot of fluctuation depending on where in the U.S. you live.  This means that the entire lower middle class and a good portion of the middle class are technically in poverty.  This lower and middle part of the middle class is the same demographic that went into the excessive debt that helped facilitate the recent economic crash.  Poverty is bad for the economy.

Now that I have mentioned that much of the middle class is in the poverty category that is affected by the decision making problem, I want to look at some of the bigger implications.  Our education system is pretty poor, but maybe part of the problem is the poverty of many of the students (even without that, our education system is terrible; this might be yet another contributing factor though).  Also, most grunt laborers and lower management are technically in or near poverty.  To get straight to the point, maybe the problem with Americans being less than ideally educated and acting pretty stupid a lot of the time is that most Americans fit into some level of poverty that is affecting their decision making skills.  Consider the typical store manager.  He (yes, more males are store managers than females, which makes the typical store manager male) is probably making $40,000 a year or less.  If he has more than one child (and a wife), he is probably in the upper levels of poverty (depending on local cost of living).  Now, in addition to having to make difficult money decisions for his family, he also has to make more difficult decisions daily for his job.  No wonder a lot of store managers frequently make poor decisions.  Even without the poverty, it is likely that the average CEO is way out of his league in difficult decision making, not because he has some problem, but because humans have limited capacity for making difficult decisions.  Unfortunately, in many ways, this is just a problem we have to live with.  Even the richest guy in the world is subject to the limitation, and further, with all of the responsibilities that typically come with being rich (or rather, that are an inherent part of being rich, because the process of getting and maintaining riches typically requires a lot of hard decisions), he probably is more subject to it than upper middle class people who are not in management positions.

Anyhow, the point is that redistributing wealth to minimize poverty is very likely to result in a far more robust economy, better relationships, a better society, and a smarter America.  Getting people out of poverty is no longer just a nice humanitarian thing that we should do because it is the right thing to do.  It is a means of raising the standard of living of even the richest people in the U.S.  Imagine if we could take all of the people currently in poverty and have them become intellectuals working to advance science and improve technology.  Imagine if we could get all of the lower class workers and lower management to a point where they consistently make decisions that are better for their employers and themselves.  Imagine if we could get most of the middle class to use more wisdom in making decisions about debt.  Businesses would run more efficiently, fewer costly divorces would happen, fewer violent crimes would happen, technology and science would advance more quickly, and the U.S. economy would reestablish itself as the leader of the world economy.  Maybe the results would not be quite that dramatic, but even if we only managed small improvements, the end result would be worth it.  There are plenty of other reasons to seriously consider aggressive redistribution of wealth, including the fact that, barring unforeseen disasters that set back our technology dramatically, it will eventually become necessary to maintain any kind of economy in the U.S.  The sooner we take action, the sooner we can benefit from an average increase in cognitive capacity in the U.S.  Video games are already slowly raising the average IQ in the U.S. (it is true); eliminating poverty promises to give this process a major boost.

28 October 2013

Advanced Housing

In the U.S. we consider our society to be an advanced civilization.  What criteria are we using to support this claim?  Well, we have advanced technology.  I could not, in good conscience, say that our behavior is any more advanced than that of the average ancient society though.  How about necessities?  We produce 5 times the food that we need (60% of which is exported and 20% of which is wasted).  We are definitely advanced here, though it does little good if there are still people starving.  I think that we are also fairly advanced as far as clothing goes.  Clothing is pretty simple; if it keeps us decently covered, sufficiently warm, and well protected from the elements, it is about as good as it can be.  Anything beyond that is beyond necessity.  The last true necessity is shelter.  We have some pretty advanced temperature management, and for the most part, our houses protect us from the elements, but there are some problems that get far less attention than they should.  This is what I would like to discuss.

I have been interested in and exposed to a lot of different types of houses in my life, mostly with reference to building materials.  Most modern houses in the U.S. are built out of wood.  In some areas, you can find houses or apartments made chiefly out of cinder blocks (our current apartment is made from cinder blocks), but they are less popular, because they are not well insulated.  Large buildings are often made from reinforced concrete, but these buildings are not typically used for housing.  On the whole, a vast majority of U.S. dwellings are made from wood.  This leads to a number of common problems.  The first is vulnerability to fire.  Wood houses are easy to burn down, and accidental fires are a major cause of house destruction.  In many areas, wildfires cause panic because of the fear of houses being burned down.  The second vulnerability is susceptibility to invasion by other life forms.  Wood houses are almost impossible to effectively seal.  Many animals and insects have the capacity to make holes in wood, and some even eat wood as food.  This makes wood houses susceptible to rodent, termite, and ant infestations, among a large range of other creatures.  Another life form of concern is mold.  Damp wood makes an ideal habitat for several dangerous types of mold.  Third, wood is a fairly weak material compared to what is currently available.  If it is dry enough to prevent rot, it is brittle and can be damaged by various stresses.  These stresses are common in some areas, in the form of earthquakes.  The average one story wooden house will probably survive a moderate quake fairly well.  Adding another story or two makes the typical wooden house a dangerous place to be during an earthquake.  In addition, wood does not insulate very well and is not very airtight.  As such, extra effort has to be expended to insulate and seal wooden houses, and even a well insulated wooden house must expend a lot of energy on temperature control.  The point is, wood does not seem to be a very good material for building houses.  Maybe the only real benefits are that it is easy to work with and can be produced fairly cheaply.

Now I want to look at how ancient civilizations handled this particular problem.  Early civilizations (some might even call them pre-civilizations) lived in caves.  Caves are very expensive for temperature control, and they allow all sorts of creatures in.  On the other hand, they have no vulnerability to fire.  As civilizations started to develop, many moved out of caves and started living in tents, typically built from wood and skins.  These were vulnerable to everything except earthquakes (if it falls down, just pick it back up), but they were highly mobile.  From there, most civilizations started making wooden buildings, whose wide range of vulnerabilities we have already discussed.  These were more permanent and fairly cheap to build, especially in quantity.  Occasionally things like fires would wipe out whole sections of cities, and most houses were infested with something (a major factor in the spread of disease).  Some civilizations actually skipped wood, though, and there is evidence that many others advanced from wood to other materials.  The final evolution of shelter for most ancient civilizations was stone or some form of concrete.  The largest native American civilizations built primarily out of stone, out of a superior form of concrete which we still have not managed to reproduce, or out of adobe, a form of clay used like concrete (some may claim that this is because they had no other option; however, evidence has been found that prior to stone and concrete, wood was a common building material in ancient America; unfortunately, because wood decomposes fairly quickly, the extent and duration of building with wood is almost impossible to determine).  These materials are fireproof, earthquake resistant, and much easier to seal than wood.  They are also not a form of food for pretty much anything, and they are very difficult to dig or burrow through.  Another benefit is that they provide a large degree of natural temperature control when built correctly.  We have found that our wooden houses decay into worthlessness in between 100 and 200 years without constant maintenance.  Houses built of stone and concrete have lasted thousands of years with absolutely no maintenance.  Until recently, stone and concrete have been the highest level of evolution in materials for building dwellings.  Ironically, our "advanced" civilization is still building mostly out of wood.

So, we think we are more advanced than ancient civilizations.  If this is true, shouldn't we be building houses out of even better materials than they did?  What materials do we have that are better than stone and concrete and that can be used to build houses?  Stone and concrete are actually pretty good building materials themselves, and we already use a somewhat superior version of them for modern buildings: we reinforce our concrete with metal, in the form of rebar and remesh.  Our concrete is not as good as that of some ancient American civilizations (we still have not figured out how to reproduce what they made), but reinforcement helps.  Where we really excel is in making metal in large quantities.  It turns out that steel shipping containers make good houses.  Multiple containers can be combined to make houses of any size, and they can be stacked several stories high.  They are already fairly well sealed, and adding a thin layer of foam insulation will further seal them as well as insulate them very effectively.  In short, we could be making houses out of reinforced concrete or even metal.  I already mentioned the benefits of concrete housing; metal housing is even better.  Metal housing needs insulation more than concrete does, but with foam insulation, it does not need much.  Metal is extremely resistant to invasion by any other life form.  It is fireproof.  It is earthquake proof.  Metal houses that are well sealed with a layer of foam insulation will last practically forever.  Metal is a bit more expensive than wood or concrete; however, there is already an existing industry making shipping containers, and containers that are no longer suitable for shipping are often still suitable for building houses.  Production could be scaled to make them cheaper, and used containers are already reasonably priced compared to the cost of building a wood house from scratch.  The logical next step in the evolution of house materials is metal, and we have everything we need to take it.

There are some other amusing but promising options.  Some people build houses out of straw bales.  This sounds counterintuitive, and it is.  Straw bales are so dense that they don't burn well.  They don't rot, because they are installed dry and then covered in a thick layer of stucco to protect them.  Their density makes them pretty structurally sound, and they flex enough to withstand earthquakes fairly well.  Straw (not hay) is very low in nutrition, so most animals will not eat much of it unless they are starving.  Further, if the stucco is applied correctly, nothing will be able to get in to infest the straw.  Straw also makes extremely good natural insulation.  Unfortunately, straw prices have increased recently, but during good economic times, straw is often very competitively priced compared to the materials used for making wooden houses.  It is hard to imagine straw as a building material superior to wood, but the evidence indicates that it is.  It is also renewable, just like wood.

Really, we don't necessarily have to one-up ancient civilizations to be truly advanced.  Concrete, even without metal reinforcement, is an excellent building material for houses.  It is also very cheap.  In some 3rd world countries, notably Pakistan, some entrepreneurs have bought small cement mixers and concrete block molds, which they set up at the side of roads on the outskirts of large cities.  They then proceed to make large numbers of concrete blocks at a very fast rate.  These blocks are used to build illegal houses, which even the very poor can afford, because concrete is so cheap.  These small single room houses can be built at a rate of several a week, with a small operation.  This technique for building houses is very scalable, so long as the houses have only one story.  In short, concrete can be used to build very resilient houses, very quickly and very cheaply.  Using casting techniques can increase customizability with only moderate cost increases.  With casting, rebar and remesh can be added to increase strength, again at a moderate cost increase.  Concrete is extremely versatile, as it can be used to produce houses with a very wide range of quality.  While metal is probably the most advanced material we can currently build houses out of (maybe carbon composites will eventually be even better), concrete is possibly the superior choice when cost and versatility are taken into account.

Why do we still build houses out of wood, if there are other superior materials that are often cheaper?  Tradition.  Seriously though, the main reason we still use wood as the primary material for building houses is that we have been doing it for so long.  People like the aesthetic of wooden houses, even if the wood is not visible.  Sloped roofs are a good example: they are common on wooden constructions because wood is structurally weak, and a sloped roof can handle several feet of snow that might collapse a flat wooden roof.  Sloped roofs are also easier to build in wood than in metal or concrete.  Wood is fairly cheap, and our building industry is well scaled to use it.  In most areas, obtaining enough wood to build a house is fairly easy with mostly local sources.  Obtaining enough concrete to build a house is more difficult.  Metal is difficult to work with in raw form, so unless parts are prefabricated, no one wants to build with it.  It is just more convenient to build with wood.

Another big reason we still use wood is legal building codes.  This is a tradition thing as well.  Legal building codes all assume that houses will be built with wood.  Running electrical wiring in a shipping container house is not that difficult, if code allows the metal body of the house to be used for grounding.  Unfortunately, because code assumes that houses are not built of metal, a separate grounding wire is required, which increases the difficulty of running wiring in a house whose walls have no space inside them.  In addition, even metal houses must comply with electrical codes designed to prevent house fires, even though metal is fireproof.  There are already some difficulties in using modern building materials to properly outfit a metal house, and building codes that assume houses will always be made of wood eliminate most of the good solutions.  This is also a problem with straw bale houses, because they don't normally have space inside the walls either.  By basing legal building codes on outdated assumptions, we have effectively stalled technological advancement in house building materials.

Is there some way we can fix this?  Well, yes.  First, building codes need to be more flexible.  They could either be expanded to address requirements on a per material basis, or they could be abstracted, replacing specific requirements with more general ones that allow flexibility.  If it is absolutely necessary that current wood-based building codes maintain their strictness, the first option would be best.  This strategy would require adding a new section for each new material added to the list of possible house building materials (so, if someone decides to build a carbon composite house, a new section of building code would have to be written to cover it).  If flexibility for adding new materials is more important, then the second option would be better.  The downside of that plan is that it would probably leave loopholes that could result in a decreased standard of quality for wooden homes.  One way or another though, building codes need to be flexible enough to accommodate new building materials in a timely manner.

Second, we need to get over our society's attachment to wooden house architecture.  Beautiful architecture is great, but if it comes at the cost of safety, it is not worth it (in my opinion, most wooden house architectures are far from beautiful anyway).  Safety aside, it is possible to have beautiful architecture without wood.  We can easily look to Middle Eastern architecture, where wood was and is very scarce, to find beauty without wood.  Some versions of the ranch style, which is popular in the southwestern U.S., take much of their look from the abandoned adobe cities found in that region (in fact, stuccoed ranch style homes look very much like the ancient adobe houses).  Some ancient architectures that originally used marble or granite would work very well in concrete and are considered by many to be the most beautiful architectures the world has ever seen.  Even many parts of Europe have houses built from brick, using an architecture not far removed from U.S. wooden houses.  This is evidence that we would not have to change the wood architecture much to get something more friendly to superior building materials.

There is a third hurdle, which is possibly the biggest.  Even if we get building codes up to date with modern technology, and even if people learn to think beyond modern wooden house architectures, our builders lack training to build out of anything but wood.  This is not entirely true, but for a majority of builders in the U.S., it is.  We build using wood because it is all we know.  Evidence that we are less advanced in this area than previous civilizations is that we don't have a robust stone masonry profession.  Bricklaying has become more of an art form than a building profession (we paste more bricks to the outside walls of wooden houses than we use to actually build walls).  I can hear some people saying that someone must know how to make concrete structures, because that is what large buildings are made from.  This is certainly true, but they don't know how to build houses.  The small part of our building industry that makes concrete structures only makes big buildings.  We have amassed great amounts of knowledge of how to most effectively build wooden houses, but we know little about effectively making houses out of anything else.  Overcoming this hurdle is going to cost.  Training does not come for free.  In addition, to learn how to do things right, we will have to do them wrong a few times.  With concrete or metal, the cost of mistakes will likely be small, but the mistakes will be permanent.  Even a big mistake is unlikely to result in the collapse of a house, but it could result in things like water leakage that requires regular sealing (water leakage is far less damaging to concrete or metal than to wood, but it can still cause humidity issues that might lead to mold or damage to personal belongings).  That said, these issues already occur regularly in wooden houses, even when best practices are observed, due to the nature of wood as a building material.  If we don't start building with new materials, we will never figure out how to do it right.

These three problems will ultimately be expensive to overcome.  Changing building codes to keep up with technology will take a lot of political pressure, which will cost time if nothing else.  Changing people's perception of architectural aesthetics will be even more difficult.  As with anything new, until the market for superior houses sees a lot more business, the cost will be fairly high.  Houses built from anything but wood will be treated as custom work until popularity increases.  Training is inherently costly, and this cost could be more problematic than either of the other problems.  When working with new materials, mistakes are bound to be made.  While the cost of these mistakes may be lower than the cost of maintaining wooden structures, it will be perceived as very high (we have gotten used to the costs of maintaining wooden houses).  This will hinder general public acceptance of houses built from new materials.  These three hurdles have a kind of synergy that will make overcoming them even harder and more costly, but the end result will be worth it.

As I mentioned before, most urban houses in Pakistan are built from concrete, and on the outskirts of urban areas, many illegal houses are built from concrete blocks.  These (fairly large) concrete blocks cost well under a dollar apiece to make.  The small illegal houses are typically made from twenty or thirty of these blocks, and they have only one room.  The cost to the "contractor" for each house is less than $30 for materials, and since labor is cheap, the total cost per house is probably well under $100.  The land is free (it is technically owned either by the city or the province, which is part of why the houses are illegal).  In the U.S., we cannot get away with the "free" land thing, but if we learn to build using concrete, we could still reduce the price of housing dramatically.  One problem we have in the U.S. is high expectations for standard of living.  If you think that electricity, running water, and one bedroom per person (with a double sized master for a couple) is normal, you are deluded.  Most of the people in the world live in houses without running water, with little or no electricity, and with only one or two rooms for a family of four or more.  Now, I am not saying that we should live like that too.  What I am saying is, we should be happy enough to have electricity and running water; if our children have to share rooms, big deal!  In fact, most people end up spending most of their lives sharing a room with a spouse (or something).  If we give every kid their own room, we are doing them a disservice.  They need to get used to sharing (no wonder so many relationships fail shortly after getting into shared living arrangements).
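To make that arithmetic concrete, here is a minimal sketch in Python of the cost estimate I just described.  The per-block cost and block count are the rough figures from this post; the leftover labor-and-misc amount is just my guess to reach the "well under $100" total, so treat the output as an illustration, not real accounting.

# Rough cost arithmetic for the roadside concrete-block houses described above.
# Figures are the approximate numbers from this post; the labor remainder is a guess.

COST_PER_BLOCK = 1.00      # dollars; the post says "well under a dollar apiece"
BLOCKS_PER_HOUSE = 30      # "twenty or thirty" blocks for a one-room house
LABOR_AND_MISC = 70.00     # assumed remainder to reach the "well under $100" total

materials = COST_PER_BLOCK * BLOCKS_PER_HOUSE
total = materials + LABOR_AND_MISC

print(f"Materials per one-room house: ~${materials:.0f}")   # about $30
print(f"Total per one-room house:     ~${total:.0f}")       # about $100

Obviously U.S. labor, land, and code compliance would multiply that figure many times over, but the point of the sketch is how little the shell itself costs.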

Another problem is that we do not know how to handle concrete efficiently in a modular manner.  Our concrete building industry uses a technique that I am going to call "large casting."  This involves building wall molds, then pouring in the concrete.  Once the concrete is set, the molds are removed.  In essence, the walls are constructed by making very large custom molds, which are destroyed after the walls have been made.  We should all know that custom made anything is more expensive.  For large buildings, large casting makes sense, because the height of the buildings makes more modular techniques less safe and very difficult.  Houses are not large buildings though.  Even two or three stories will work with more modular techniques.  The Pakistani method of casting concrete blocks and building with them will work fine as long as the blocks are not stacked too high and do not have to be lifted very high.  The benefit of a modular system like this is that you can take the cement mixer to the building site, cast the blocks, then build the house, without ever needing a custom mold.  That last part is the key.  Making those custom large casting molds takes a lot of training to do right.  If the builder could just buy five or ten pre-made block molds (like builders in Pakistan), they would no longer need the labor of expensive concrete workers.  This is a big cost reduction.  If the block molds left holes in the blocks, rebar could even be added after stacking the blocks, to reinforce the structure.  In addition, concrete, in bulk, is cheaper than wood.  This is a further cost reduction.  Some of the savings would be offset by higher costs for things like insulation (foam insulation would be mandatory, both to provide a vapor barrier and because you cannot easily tack fiberglass to concrete), but eventually the overall cost would drop as other industries scaled to the new materials.  In the end, houses could be built faster, cheaper, and to a higher quality, all at once.  With steel shipping containers, it is even easier.  Construction would be reduced to moving the containers, welding multiple-container constructions together, insulating, wiring, and finishing (ok, HVAC as well).  Instead of building houses, it would be more like converting an existing structure into a house.  In short, once industry scales to new building materials, their cost will drop, probably to less than the price of current materials.
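For a sense of scale, here is a minimal Python sketch of the kind of block-count estimate a builder using pre-made molds might do.  Every number in it (block size, wall height, footprint, mold count, pours per day) is a hypothetical placeholder I made up for illustration, not a figure from any real project.

# Hypothetical block-count estimate for a small modular concrete-block house.
# All dimensions and rates below are illustrative assumptions only.

BLOCK_LENGTH_FT = 1.5          # assumed block face length, in feet
BLOCK_HEIGHT_FT = 0.75         # assumed block face height, in feet
WALL_HEIGHT_FT = 8.0           # assumed wall height
FOOTPRINT_FT = (30.0, 40.0)    # assumed exterior footprint (width, length)

perimeter = 2 * (FOOTPRINT_FT[0] + FOOTPRINT_FT[1])
wall_area = perimeter * WALL_HEIGHT_FT               # ignoring doors and windows
block_face_area = BLOCK_LENGTH_FT * BLOCK_HEIGHT_FT

blocks_needed = wall_area / block_face_area
print(f"Wall area:     {wall_area:.0f} sq ft")
print(f"Blocks needed: ~{blocks_needed:.0f} (before subtracting openings)")

# With ten molds and a few pours per mold per day, the shell's blocks could be
# cast on site in roughly a month under these assumptions -- no custom
# large-casting molds and no specialized casting crew required.
molds = 10
pours_per_day = 3
days = blocks_needed / (molds * pours_per_day)
print(f"Casting time with {molds} molds: ~{days:.0f} days")

The exact numbers do not matter; the point is that the whole estimate fits in a dozen lines, which is exactly the kind of simplicity that makes the modular approach cheap.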

There is one other potentially major benefit to building with concrete and steel instead of wood: it is greener.  Right, I can hear you asking "how?"  Well, let me explain.  Wood is a renewable resource (sounds like I am arguing against myself), and it has many uses.  Concrete is not officially renewable, but the materials it is made from are extremely plentiful and easy to obtain.  Steel is not so plentiful or easy to obtain, but there is plenty of steel in industrial waste, and further, I suggest that the ideal steel houses would be built from shipping containers that are no longer fit for shipping but still have enough structural integrity to build with (for shipping, they have to be able to stack many high when full of heavy cargo; in contrast, houses would not stack more than 3 or 4 high and are mostly empty space).  So building with shipping containers is a form of recycling.  Now, we still have the wood thing to deal with.  A great deal of wood goes into building.  Processing construction quality wood uses a lot of energy (sawing, drying, chemical treatment, creation of plywoods, etc...), and chemically treating wood introduces contaminants into the environment.  Instead of using wood for building, we could be using it as a renewable energy source.  Burning wood is a carbon neutral means of producing energy (so long as trees are replanted).  Coal power plants are already well suited to burning charcoal, if not straight wood.  Technically, it would do no harm to let the non-carbon byproducts of wood burning loose in the atmosphere (these byproducts will mostly fall to the ground and be absorbed as new trees grow), but if people are concerned, the filtering methods currently becoming popular would work on charcoal smoke just as well as on coal smoke.  So, building with concrete and metal reduces the byproducts of processing wood for building, recycles certain metal products at a very low cost, and frees up a major renewable resource for energy production.  All of this is in addition to the potential for reducing the cost of housing and making housing safer.

The only real things standing in the way are building codes and a high initial cost to get the industry going.  The benefits are greenness, lower long term housing costs, and consequently an improved economy (reducing housing costs reduces long term debt, which increases discretionary income, which increases consumer spending, which, according to economists, is what drives a good economy).  We need to start pushing our governments to extend or relax building code laws to facilitate construction using advanced modern building materials.  The benefits of building with advanced materials clearly outweigh the costs.

16 October 2013

IGMO?

What are Indirect Genetically Modified Organisms?  Well, it is a term I just invented, but it describes a process that was invented millennia ago.  Nearly all of the food that we eat is, technically, genetically modified.  Now, the term GMO, used in a legal sense, means an organism that is the consequence of directly manipulating the DNA of another organism.  Most genetic modification currently in use makes plants resistant to strong herbicides, so that the herbicides can be used to kill potentially harmful weeds without harming the main crop.  The main argument against direct genetic modification is that the consequences of directly altering DNA are not well understood.  In theory, it is possible that a single very minor change to DNA could cause the production of a toxic substance in a plant or animal, making it unfit for human consumption.  While there are regulations in place requiring extensive testing, it is impossible to test comprehensively enough to ensure that nothing harmful is being distributed to the public.

Indirect Genetically Modified Organisms use a less intrusive method for altering DNA.  In fact, when the process was invented, DNA had not even been discovered.  Today we call this process "selective breeding" or "animal husbandry."  It takes advantage of the ideas behind natural selection to modify an organism to be what we want or need.  This takes longer than direct genetic modification, but it is also more reliable and predictable.  The interesting part is, nearly everything we eat is IGMO!  Cows, pigs, and chickens did not magically start out as domesticated farm animals.  Non-IGMO turkeys are scrawny, with tough meat.  There is evidence that wheat did not even start out as a grain (it would have been similar to amaranth and other pseudo-grains).  In fact, it is possible that the true grains in general are entirely man made.  Similarly, most vegetables and fruits we eat are the result of thousands of years of IGMO.

Is there anything we eat that is not IGMO?  Probably.  Most foods that are not deliberately cultivated by man are probably not IGMO, though it is almost certain that we have had some impact on them (through our impact on the environment).  Most fish are probably not IGMO, though we do not know whether previous civilizations practiced some kind of selective breeding with fish that were then released back into the wild.  Also, some fungi, like truffles, are still gathered in the wild, which means that they are probably not IGMO either (it is possible, however, that liberal gathering of easily visible truffles unintentionally selected for truffles that are extremely difficult to find; that said, pigs like truffles too, so it is equally likely that wild pigs caused this trait).

I would say that it is almost certain that man has had some kind of genetic influence on pretty much all food that we eat.  I would not use the term IGMO to refer to foods that have not been intentionally modified though.  So foods that have only been genetically modified as a side effect of our environmental impact or gathering patterns would not be IGMO, but foods that we have deliberately selected for specific traits would be IGMO.

Anyhow, the point of all of this is: there is very little food that we eat that has not been deliberately genetically modified by humans.  The means of genetic modification are varied, and it is possible that some are safer than others, but it is all still genetic modification.  I am not saying that we should not be skeptical of direct genetic modification used in the food industry, and there are plenty of reasons to oppose it besides potential toxin risk (for instance, some U.S. companies donate GMO seeds to 3rd world farmers to get them dependent on the better growth properties of the seeds, then stop donating and use patent law to extort money out of them).  What I am saying is, fearing GMO for its own sake is absurd, because nearly everything we eat is some kind of GMO.

Is Socialism Inevitable?

A while back I wrote about how industrial automation is reducing available work.  I expressed my opinion that this is not necessarily a bad thing, and I showed how aggressive automation will eventually make widespread welfare essential to our economy and even survival.  I have gained some new information and insight on this subject, and I would like to revisit it.

First, I would like to summarize the automation thing.  As more and more processes are automated, more and more work is eliminated.  (Note that I prefer to say that work is eliminated, not jobs.  This is because a "job" is an abstract idea that is subject to interpretation.  I could have twenty people do the work that one person would do and thereby create a bunch of jobs, but they would be worthless jobs.)  If we take this to its logical conclusion, eventually every piece of work that can be efficiently automated will be.  Machines will produce everything necessary for survival.  The catch is that there will be almost no work needed to maintain production.  In other words, there will not be enough work necessary to justify employment of a majority of the population.  The only option to maintain a stable economy (and survival) will be a robust welfare system that provides aid to a vast majority of the population.

Now, I recently read an article on "prosumption."  Many U.S. businesses have started having customers take over roles traditionally filled by paid employees.  The article even talks about a restaurant where the customers cook their own food.  It is really little more than a combination kitchen and grocery store, scaled for the use of many people at once.  This allows the restaurant to hire fewer employees by making the customers do most of the work.  It turns out that not only is this strategy profitable, but many customers find the novelty of the experience enjoyable.  Unlike automation, this strategy does not reduce work.  Instead, it eliminates jobs by getting people to work for free (this is why I differentiate between work and jobs).  Just like automation though, it shifts the balance of monetary flow towards businesses and away from consumers.  Ignoring automation and taking this trend to its logical conclusion, everyone will eventually spend a majority of their time paying a lot of money to work for free.  This will eliminate most jobs, which will ultimately have the same impact as automation.  Now, I will admit that, unlike automation, this trend is unlikely to play out to its logical conclusion, but it is also very unlikely to go away.  It will probably eventually reach a balance where people start to realize that they are working for free and will be unwilling to go beyond certain limits.  They will still be willing to do some work for free though (for instance, self checkouts, where people work as their own cashier).

Together, these two trends have a synergy.  Automatable work will eventually be automated.  Most non-automatable work can become the victim of prosumption.  Cooking is very hard to automate due to different tastes and things such as allergies.  It is clear from the above example, however, that many people are willing to cook their own food in a restaurant environment.  Pumping gas is also difficult to automate, because every car has its filler opening in a different place, but most people in the U.S. have already been trained to pump their own gas.  With the advent of cheap 3D printers, more and more people will be able to do the work of engineers and artists in creating their own products.  There will always be a few jobs that will have to be paid, most of which are high end engineering jobs and machine maintenance jobs (someone has to keep the automation working).  In the end though, automation and prosumption can together eliminate a vast majority of jobs.

What happens then?  Well, as I mentioned before, if there is not enough paying work to go around, there are two options.  Either most of the people starve to death, or those with all of the resources give to those without.  If the people starve, there are no consumers and the economy collapses.  If the economy collapses, no one has enough resources, the machines decay, production stops, and even the most rich die of starvation.  For society to survive, it is mandatory that the resources of the rich be distributed to the general population.  When there is barely any work available, it will become necessary that people be provided things which are necessary for survival, without working jobs for the privilege.

Now, I am not saying that full-blown communist socialism will be necessary, and in fact, I think it would cause exactly the same problems here that it has caused everywhere else.  A highly socialized system truly is inevitable though, if we can manage to avoid economic collapse long enough to get highly automated.  We may even be at the tipping point already.  For most of the last decade, unemployment has been high.  The recent trend has been steadily increasing, or at least persistently high, unemployment rates.  This has lasted long enough that the numbers are beginning to get skewed, as unemployed people give up on finding work and are no longer counted as unemployed (the official rate only counts people who are actively looking for work).  Experts keep expecting this trend to turn around, but the evidence does not support that conclusion.  It is possible that we have finally hit the point where there is literally not enough work to go around.
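To illustrate the skew I just mentioned, here is a minimal Python sketch of the headline unemployment calculation, which counts only people actively looking for work as a share of the labor force.  The population numbers are made up for illustration; only the formula reflects how the official rate is defined.

# Illustration of how discouraged workers skew the headline unemployment rate.
# The rate is (people actively looking for work) / (employed + actively looking).
# All population numbers below are made up.

def unemployment_rate(employed: int, actively_looking: int) -> float:
    labor_force = employed + actively_looking
    return actively_looking / labor_force

employed = 140_000
looking = 15_000          # unemployed and still actively job hunting
print(f"Before: {unemployment_rate(employed, looking):.1%}")   # ~9.7%

# Suppose 3,000 of the unemployed give up and stop looking.  No one found a
# job, but the headline rate improves anyway, because the discouraged workers
# simply drop out of the labor force.
looking -= 3_000
print(f"After:  {unemployment_rate(employed, looking):.1%}")   # ~7.9%

Nobody in this toy example found a job, yet the headline rate falls by almost two points, which is exactly the kind of distortion the official numbers are subject to.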

I believe that the recent recession is evidence that we are near or at the tipping point.  One major consequence of the recession is that businesses are becoming tighter with their money.  Industry in general is beginning to automate more to reduce risks associated with an unstable economy.  The more businesses can eliminate paid labor, the better they can handle fluctuations in the value of money.  In short, it appears that the recession has driven automation and prosumption forward another step.  We are seeing the result of this in unemployment rates that are not improving despite government efforts to stimulate the economy.  The current government shutdown has been predicted to cause more economic issues, as similar shutdowns have in the past.  Further, a government shutdown, regardless of its impact on the economy, makes the economy feel less stable, which is very likely to push automation and prosumption even further.  If we are not at the tipping point now, we probably will be within the next year or two.

Currently there is a major problem with increasing welfare to mitigate the effects of extremely cheap production.  Our government has a huge amount of debt, and it does not have sufficient income to sustain a welfare system as robust as we need.  In part, this is because of bloat.  We have a huge number of social programs, many of which are unnecessary and several of which serve the same purposes for different groups of people.  We need to reduce bloat by eliminating low value social services and by combining redundant ones.  This will not solve the problem by itself, however.  The next step is fair taxation.  Besides the fact that it should be obvious that treating ethereal entities (such as businesses) better than individuals is wrong, the biggest potential source of taxable income is large businesses.  The problem is that large businesses also get more tax cuts than almost anyone else.  The only group that gets more tax cuts than businesses is the bottom of the lower class, who do not make enough money to pay taxes at all.  The welfare system we will need to support the huge population of unemployed that is likely to appear in the next decade or so cannot be funded by anyone except large businesses.  Ironically, it appears that taxing businesses exactly the same as individuals would fix this problem in a single shot.

We do not need stimulus to create more jobs.  In fact, stimulus does not work very well anymore, because large businesses use it to buy more automation (or to give bonuses to already overpaid CEOs), which actually eliminates jobs.  Government contract work can create some jobs, but typically not very many and only temporarily.  The problem is not that there are not enough jobs.  The problem is that there is not enough work to provide enough jobs.  In other words, jobs are becoming increasingly unnecessary to the survival of our economy and nation.  People still need to survive though.  The rich clearly cannot be trusted to provide for the unemployed.  As such, it is necessary for our survival as a nation, as an economy, and as individuals, that a robust welfare system be created that is funded from the very top.  If this is not done, the ultimate consequence will be massive economic and then government collapse, along with mass starvation and probably violence.  Pure socialism is not inevitable, but without a good helping of it, we are in trouble.