Tuesday, March 31, 2009

Digital Rights Management

If you asked the average publishing house these days, they would say that a DRM program is necessary to protect their work from being pirated to death by some hoodlum in Accra, Hong Kong, or Tbilisi. But the consumer has spoken: DRM will not be tolerated. Especially after one scheme hacked the user's computer to keep itself installed at all costs. Hijacking the customer's things is a big no-no.

DRM itself may have earned a bad name, but publishers have taken to using it under other names, because in their hearts they still don't trust the customer the way they used to.

So what is DRM anyway?



DRM is software that prevents the copying of a work. Before computers, music companies put out records, movie companies showed their films in theaters, and if customers wanted a copy, they had to go out and buy it, because record-pressing equipment cost as much as a house and home theaters didn't exist. The publishers liked that fine.

Records gave way to the tape, which could be, gasp, copied. However, the process wasn't easy, and some deterioration occurred. People still bought tapes of music they really liked, and though some cheapskates copied music, profits were higher than ever, since people bought more tapes for portable boom boxes, cars, and so on.

The tape gave way to the CD. The CD represents its information digitally, which means it can be, gasp, copied without error. Still, only a few diehards really violated the copyright, as it was a major pain in the ass. And CDs were cheap enough that it was easier to just go out and buy one.

On the movie front, the tape era arrived about five years later with VHS and Betamax, and the CD era arrived with the DVD: a larger-capacity version of the music technology, in a way.

But lo, with computers, sound cards, and the internet came a technology that really, truly scared the publishers: Napster. Computers could store a reasonably close representation of a song and endlessly copy it with no further loss of information. With networks and Napster, people could copy music from people they had no other contact with. Publishers feared not only that no one would ever buy a CD again, but that the very idea of music would become associated with free access. That from then on, music, movies, and anything representable as information would be expected to be free. It would be hard to sell more than one copy, which would never pay for the cost of production.

Publishers have so far reacted with a frenzy of lawsuits to try to stop the copying, and with DRM technology to prevent the copying in the first place. DRM is a piece of software that intercepts attempts to copy and blocks them. It allows access only under circumstances the publisher considers reasonable, such as watching the movie (or listening to the song). Access is denied for, say, copying the information to another computer. (Or, if it is copied, the copy refuses to play there.)
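To make that concrete, here is a tiny toy sketch of the idea in Python (entirely my own illustration, not any real DRM scheme): the publisher whitelists the actions it finds reasonable and blocks everything else.

```python
# Toy model of the DRM idea above: permit the actions the publisher likes,
# refuse the rest. Real DRM hooks into drivers and players; this only
# consults a policy table.
ALLOWED_ACTIONS = {"play"}  # e.g. watching the movie or listening to the song

def request(action: str, content: str) -> bool:
    """Return True if the publisher's policy permits the action."""
    if action in ALLOWED_ACTIONS:
        print(f"{action} '{content}': permitted")
        return True
    print(f"{action} '{content}': blocked by the publisher's policy")
    return False

request("play", "some_album")  # permitted
request("copy", "some_album")  # blocked
```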

The consumer rejected DRM. Why?


Copyright law does give publishers the right to control copying. However, DRM didn't stop there. Some publishers wanted to put out media that could be played only once and would thereafter be garbage. Want to experience it again? Well, you'll have to buy another copy then! And since copyright law gives the publisher a monopoly on distribution, they could very well force this model on you. A bad deal at any price, as far as most music fans are concerned. (Many fans can listen to a piece of music thousands of times. A movie fan can easily watch the same movie a hundred times.)

Furthermore, many DRM schemes grossly overstepped their bounds. Sony's attempt at DRM, linked above, hijacked the operating system in an attempt to prevent any sort of circumvention. Had Sony been even slightly more malicious about it, it would have been in a prime position to steal the customer's personal information (such as credit cards) or use the machine (that the customer purchased) for its own ends (such as running its engineers' simulations, or launching internet attacks at Sony's enemies). Sony got quite a smacking for that.

What should companies do to make things better?


DRM is only tolerated when it is seamless. It should not be noticeable to anyone not attempting to copy the work. If anybody does notice it, you probably did it wrong.

Most consumers know that publishers will vanish forever if they cannot make money, and they don't want that to happen. One could launch music on older media, or just laugh off some piracy as a cost of doing business.

Know that the market will not blindly accept any business model you put forth, but customers will decide who should prosper and who should not.


What should consumers do to make things better?


Piracy ensures that there will never be any new music. Boycott works whose DRM offends you. Yes, that means losing out on some movies, music, and games. They're not a human right, and you could live fine without them. Pirating them tells the publisher that they're doing the right thing, and that it would succeed if only those damn pirates weren't ruining everything.

On the other hand, absolutely do not allow rootkits. Media you buy shouldn't try to hijack your things. It's your tape player, CD drive, or computer, not theirs.

Petition Congress to shorten the length of copyright. It's been stretched over time. At first, publishers got 28 years to sell their work before it became as free and public as air. Then the publishers wanted 50. Then 75. Now it's the life of the author plus another 70 years.

The public-domain release of old information is a major societal compromise. The publisher wants exclusive rights to it forever, the public wants to copy it immediately, and other producers want the inspiration for derivative works (like a remix, or a reinterpretation, or use in some other form) as soon as possible.

What would go wrong with the two extremes?


Let's start with copyright not existing.

Any idea you create can now be used by any person for any reason. If you hum a melody, a cereal executive can immediately turn it into a goofy cereal-commercial jingle. You get nothing. Very few people ever make music, and mostly as a hobby. Music, film, and art are rare, amateurish, and mostly made by people who are immensely rich. Most art is a bad rip-off of something that already exists, and when people do get artistic ideas, they keep them totally secret, lest they be stolen for bad commercials, ripped off because somebody felt like it, or parodied because someone had nothing better to do.

Now let's take it in the opposite direction. Copyright is eternal, and derivative works are treated like shoplifting the original.

BigBadCo (tm), slogan "We do things because it annoys you," buys up the rights to Beethoven's symphonies. If I then whistle the well-known opening notes of the Fifth Symphony, I now owe BigBadCo 5 cents. BigBadCo is in the ridiculous position of claiming that if I don't compensate them, they will somehow go back in time 200 years and convince Beethoven not to write it.

Art is also rare in this scenario. If your art is too close to anybody else's, they sue you until you forfeit everything. Television and movies contain no music, as obtaining the rights is like pulling teeth. Songs that sound too close to other songs get the musician sued. All stories must have a completely original plot, which is harder than it sounds. Write an epic adventure set in medieval times? Whoops, you just ripped off J.R.R. Tolkien. The Tolkien family (or whoever bought the rights from them) will promptly be suing your ass. If you claim that the universe is evil, you've infringed on H.P. Lovecraft and can expect a call from the Lovecraft family lawyer. Oh, but suppose you take up rap instead? Oops, you just infringed on 50 Cent's drum line! Pay up!

Eventually, BigBadCo decides that it hates all art, and buys up the rights to everything it can get its hands on. The tangible forms are locked in a safe, and anyone discussing any existing art form is sued. Some people try to create original art, but it is such a minefield. Too close to any existing work, and BigBadCo will sue you. They will drop the suit for $300 and exclusive rights to your work, of course. (If you give it to them, into the safe it goes.)

Oh and may the gods help you if you work, as your boss makes granting him exclusive rights to all your work in perpetuity a condition of your employment, and he will be quite litigious if he thinks you're holding out on him.

I must go now, as apparently BigBadCo owns the rights to all deities, and I seem to owe them $1000 for that last reference.

Monday, March 30, 2009

Street View Backlashes

When Google (who hosts this blog) started their Street View addition to their map program, everybody loved it, right? After all, now you could plan a trip somewhere and see the route without even leaving your house.

Well, everybody but these guys in what I suspect to be America, but could just as easily be some other drive-on-the-right-side country. They got caught doing something they hoped nobody would notice.

Google has also found that other countries have very different ideas about privacy than its own United States. In Japan, crowding is intense and personal space is at a premium, so there, even though a house faces the street, looking directly in is forbidden. London, in the UK, likewise freaked out. The city may be riddled with cameras, but that footage is seen only by the police, and it gets dumped after a while.

I can sympathize with the complaints, which mostly revolve around the fact that every person has unfettered access to the imagery. Given enough detail, wanna-be burglars could use it to research break-in points. Many houses also have large picture windows, allowing essentially anyone to glance in. I don't want to live in the kind of society that thinks that if I didn't specifically bar and lock something, it's freely accessible to anyone who feels like it. After all, if I leave my car unlocked, that doesn't give people the right to hot-wire it and drive it off.

What Google's responsibilities are here is unclear. They have already re-recorded a number of streets because they happened to catch something embarrassing on their pass through (and because the person depicted complained). Politicians have gone so far as to demand that Google blur all buildings, on the grounds that knowledge of a building's configuration would be enough to plan a terror attack on it. There are even people who insist that Street View should not exist. However, other people love the information and would be at a disadvantage without it.

Obviously privacy rights will differ from culture to culture, but can Google make one system that works with all of them?

Friday, March 27, 2009

The Robot and the Beast: A tale of human development

Humankind developed from animals. Many people would like to claim otherwise, but wishing doesn't make it not so. Our past has left quite a few legacies in our minds and bodies. Most importantly, emotions. Emotions kept our animal ancestors alive and functional.
A common sci-fi trope is humans being replaced by robots. Robots would have minds like ours, but mechanical ones, lacking the emotions that served our animal ancestors, because emotions would serve them no purpose.
Together, I think these make a good model of what humans are like: emotional like animals, but also logical and abstract like robots. It would be wrong to remove either trait, as that would reduce us to the other one. A person with no emotions is a poorly made robot. A person with no thoughts is a crazy beast that must be confined to a cage for the good of the rest of us.

Tuesday, March 24, 2009

Other Mad Engineers

When it comes to building machines that make little or no sense, I'm not alone. Let's go over a few that I found with a quick search of Google, ignoring the matches that were just people bragging about their "mad engineering skillz." (Which they may or may not have actually had.)

There's another mad engineer right on our own blogspot. He's of a rather different political persuasion than I am.

A Mr. Quackenbush has proven quite the inventor, boasting 7 strange inventions, many with detailed instructions.

A man named Marv may not have come up with the ideas on his own, but was kind enough to point out the 7 wonders of the modern world. Thanks, Marv.

In England, an entire company has dedicated themselves to making your car more interesting. Although clients are requested to keep at least a modicum of sanity when ordering, because the "mad" part is actually an acronym.

A Polish company would like an engineer who can speak German and English. The madness of their science cannot be evaluated by me, since I speak neither German nor Polish.

DragonconTV invites you to learn more about mad scientists in general. The more you know...

Lumrix presents a treatise on mad scientists that's strangely similar to the one on Wikipedia. Unlike Wikipedia's copy, it notes that few mad scientists venture into civil engineering, geology, metallurgy, math, or the social sciences. Perhaps because a script revolving around those fields would be incredibly boring, but I am now inspired to write articles about those subjects.

Mr. Mudnoc of Atlanta is a mad...sound...engineer. I'm not a member of Myspace, which has forbidden me from listening to his work.

Dr. Anderson isn't a mad scientist, he just loves it a lot.

Ms08tx ... um, I can't read Japanese, but I think this person works in IT. Speaking of Japanese postings, any idea what this is?

Neatorama would like to remind you how to fold a shirt as an engineer would do it. And the way I've advocated doing it.

The other entries were mostly people talking about mad cow disease.

Monday, March 23, 2009

History Repeats Itself

Over 160 years ago, during the administration of John Tyler, the Whig party had a major problem with the official Bank of the United States. (Yes, there was an official one back then.) So when it came time to charter the bank anew, the Whigs in Congress completely rewrote the charter to fit their own ends. They figured that since President Tyler was one of them, he'd sign it.

He didn't sign it; he vetoed it. The new charter would have made the bank a bad investment for anyone who owned stock in it, which would kind of doom it to failure. Having the bank that the government kept its own money in fail would be bad. He also thought the bank wasn't distributed enough, and was therefore excessively risky. He asked Congress to write a charter that businesses could accept.

So the Whigs in Congress started over with a copy of the old expired charter, changed it until it fit their needs, and told the president that he needed to sign it.

He didn't sign it. Same problem.

"AHA!" yelled the Whigs towards the press and therefore directed to the public at large. "You see, he's CONSPIRING AGAINST US! He's trying to ruin the bank! He's trying to bankrupt the country!"

Over the next few weeks, it became apparent that the public did not give three hoots about this (or even two), and was increasingly angry with the Whigs for their continued bickering over minor crap while the bank teetered on the edge of annihilation. And lo, over the next fifteen years the Whigs diminished, becoming increasingly irrelevant through the early 1850s and totally gone by 1856.

In 2008, the Republican candidate lost the presidential election, and the party lost ground in the congressional elections as well. I have meanwhile witnessed Republican panics about everything from taxes to immigration. Mostly taxes. Many of them seem confused about the loss of the election, and are sure the Democrats did something to cheat.

And the public, so far as I've seen, wants the economic crisis to be over and doesn't give two hoots about taxes until it is over. That "He's going to raise your taxes!" didn't fly during the election seems to have deeply confused the Republicans. I think this tax-obsession of theirs is endangering their continuation as a party, since they can't seem to make another plan when that one doesn't work.

In the event that they go down, I predict another pro-business party will take their place. It will also favor low taxes, but it will be willing to raise them to win elections and to pay for its campaign promises. (And cut them later, when no one's looking.) After all, the people who voted for the Whigs didn't go away in 1856. The new party will have none of Norquist's quasi-libertarian ideas and will be all for whatever promotes the success of multinational corporations.

Why does that scare me a bit?

Friday, March 20, 2009

Fishtank House

What if we combined a fish tank and a house? Instead of solid opaque walls, the walls would be made of clear polyvinyl, with the gap between the panes filled with water. Doorways would likewise be clear, and stepping through one would take you beneath a foot of water. Electrical and phone lines would need waterproof casings, of course. The water would need circulation via pumps, and filtering so as not to become algae-ridden or toxic, and a combination of fish and plants should be introduced that won't horribly destroy each other. Preferably, for lower maintenance, a food chain should be established to minimize the need to feed the fish.

Actually, let's make this an office instead of a house. Bedrooms, bathrooms, and changing areas would need opaque walls anyway, and doing this for just the hallways, living room, and kitchen doesn't seem worthwhile.

So we'd have this office complex, and the walls and ceiling are not only filled with water, but have fish living in the water. Aquarium plants sway in the undersea currents, and fish idly observe the employees and the boss working hard all day. The tank would be maintainable from the next floor up, which may or may not be the roof depending on the size of the building. Bubble stones would be in strategic areas of the tank's floor, merrily bubbling away.

Would the aquatic tranquility reduce office stress? Would the fact that people can see into the boss's office encourage transparency in other areas in the workplace? Would the fish multiply, and if so, would they be harvested or merely allowed to reach an equilibrium? If they were harvested, would they be company property?

Would this even function as an office at all, or would it just be yet another expensive avant-garde art display?

Wednesday, March 18, 2009

HURD

So this week that I have off, I've used a few spare moments to try to run the experimental operating system HURD. HURD is a Unix-like system (think "Linux"), but it runs on a microkernel and implements many of Richard Stallman's ideas about how computers should work. (Stallman is the guy behind the GNU utilities that make Linux-based systems worth running.)

HURD has been under heavy development for more than 15 years and still barely runs. Part of the reason is that the project has never had more than about 10 developers, almost all of them part-time; Linux draws all the spotlight, glory, and attention. However, I did get it to run, so let me tell you about the experience.

Most manuals suggest installing with ext2, which I did. Then I tried to start it. The ext2 server could not start from the hard disk, no matter how hard I tried. I was only able to boot from the live CD, which was then able to run the ext2 server and read the hard drive. This is bad. The error message is also remarkably unhelpful, claiming that there is a "gratuitous error." The web's suggestions about that only say that it is very bad and shouldn't ever happen. I can think of a number of things that could have gone wrong, but have no means to test them.

The system is a Debian port, which is somewhat unfortunate. Debian is an excellent system for people foraying into the Unix-likes, supporting Linux, HURD, BSD, and other kernels, but it makes certain assumptions about the system that made my installation essentially unusable. I could not install new software to the hard disk because the root filesystem was the CD, which is read-only. Debian assumes that if I want to work in a sub-area, I can wall myself off with Unix's "chroot" command; this, for some reason, crashes the machine. Other distributions can install things into a "new root," helpful for setting up subpartitions, new installations, and so on, but Debian's developers seem to consider this unnecessary and provide only minimal support for it.

Since last time I tried, they have thankfully fixed that bug in which pressing any key during the system startup would crash the machine, requiring a reboot. They have not fixed the bug in which reading a disk partition requires copying the entire thing into memory. Apparently, HURD developers have either extremely small hard drives, more extensive RAM than the average user, or both. While fixing this isn't within my prowess yet, it shouldn't be too hard for a systems developer. (I know exactly why they do it this way -- it's easier.)

HURD works very well as a Unix-like system once it does get up and running. It supports many of Linux's filesystems and has an easy framework for writing new drivers. Plans exist for exotic filesystems that no other OS would even consider, such as an FTP-based one. This would let a person treat FTP directories as part of their hard disk, making maintaining webpages and so on super easy. The microkernel architecture ensures that drivers shouldn't crash the system, since if they fail, they can be restarted or replaced. The system would also be great for upgrading, since a reboot is only necessary if the kernel itself is replaced, which would be very rare. Compare this to Linux (in which drivers live in the kernel, so upgrades need reboots) and Windows (which needs reboots after installing new drivers, new software, new wallpaper, and whenever Steve Ballmer feels like it).
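For a rough feel of what that FTP idea automates, here is a small Python sketch (my own illustration using the standard ftplib module; the host name is hypothetical, and this is not how Hurd's translator is actually implemented): it lists a remote FTP directory as though its entries lived under a local mount point.

```python
from ftplib import FTP

def list_as_local(host: str, remote_dir: str, mount_point: str = "/ftp") -> None:
    """Print a remote FTP directory as if it were mounted locally."""
    ftp = FTP(host)
    ftp.login()            # anonymous login
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        # A translator would expose these as real filesystem paths;
        # here we only print what those paths would look like.
        print(f"{mount_point}/{host}{remote_dir}/{name}")
    ftp.quit()

# Example (hypothetical server):
# list_as_local("ftp.example.org", "/pub")
```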

I would say that at the moment, HURD would work for a systems-programmer. It is still 3 years from being accessible to programmers in general, 10 years from power users, and 20 years from the populace at large.

EDIT: Next month, I got it to work. And then promptly broke it by attempting to upgrade it. Nice going tightly coupling everything, guys.

Tuesday, March 17, 2009

The Autism-Vaccine Debacle

There's a continuing news story that I hear quite a lot about: the claim that contaminants in vaccines, such as preservative chemicals (or even the vaccines themselves), are the cause of an ever-increasing number of autistic children. I'll try to cover this as neutrally as possible.
Autism is a developmental disorder in which a person does not develop socially, communicates poorly, values their own inner experience far above social consensus (and possibly above the external world entirely), and has only a few interests, which they follow obsessively. Having an autistic child is quite distressing to the parents, since the child doesn't respond to social signals like hugging, and often ignores the parents entirely in favor of manipulating objects. Autism is a spectrum disorder, meaning that it comes in varying degrees: a person could be only a little autistic, very autistic, somewhere in between, or not at all.
Autism is first noticeable in early childhood. At an age where children first start talking, the child does not talk. The child moves strangely, and insists on playing in a few ways over and over again. Upon being directed to another game, the child will return to their own preferred method instead.
The disorder has been known for long enough to watch people grow into adulthood with it. Adults with milder autism often learn, intellectually, the social skills needed to succeed. They are known to have trouble with relationships, having had a late start with social skills, and to often be strangely good at art, animal management, or some other talent built on their interests. They are often described as reclusive, eccentric, odd, and obsessive.
Parents of autistic children have been frantically searching for a cause they can blame, since when this disorder was first described, it was blamed on the child's mother being cold and unemotional, and some parents fear that this is still how it would be understood. Vaccines have often been blamed, since the first major cluster of vaccinations is usually given around the same age that the disorder becomes apparent. And since one of the common preservatives used when autism was first being diagnosed contained mercury, a substance known to cause brain damage, parents blamed the vaccine for their child's condition.
The preservative, thimerosal, was quickly removed from vaccines, but no drop was observed in the rate of autism diagnoses, which only kept increasing. The increase was particularly noted in Silicon Valley, where many geeky people had settled and begun to raise families with other geeky people. Nerdiness may be a very slight form of autism.
Autism very likely has some genetic component, since geeky people are more likely to produce autistic children. Food may also cause or aggravate the symptoms: autistic people have been noted to have any number of gastrointestinal disorders co-morbidly (that is, together with the autism in a way that suggests a link), and opioid-like compounds in food have been reported to worsen symptoms in one group studied, which suggests that a digestive or metabolic defect may be at hand.
Although many parents continue to claim that vaccines are causing their children's problems and are doing their best to discourage vaccination, this is a bad idea. Vaccines reduce the rate of disease propagation in general; if everyone ceased vaccinating, disease would spread very easily. This is a classic "free rider" problem: the activist parents who refuse to vaccinate their children mostly escape the consequences only because most other people do vaccinate, which stops diseases before they reach the unvaccinated.
While autism has no cure, cognitive therapy has granted many people who have it a normal, healthy life. And some people who have it are very strange indeed, but they seem happy enough.

Saturday, March 14, 2009

Happy Pi Day

In America, today's date is written 3/14. Yes, I know other countries do it differently. Anyway, the 3-14 part is reminiscent of the first digits of the mathematical constant Pi, 3.14. Pi appears quite a bit in engineering, certainly if you're working with anything that is in any way circular or round. So I'm going to give you some facts about pi.

Pi is irrational, meaning that it cannot be represented as a fraction of integers and that its decimal expansion goes on forever without repeating. It is also transcendental, meaning that it is not the root of any polynomial with rational coefficients; in particular, it is not the square root of any rational number. There are infinitely many (in fact, uncountably many) transcendental numbers, but only a few have any real use.

For most engineering concerns, 355/113 is close enough. This is 8.49 x 10^-6% (0.00000849%) away from the actual value. Engineers would never need more than 39 digits, enough to calculate the circumference of a circle the size of the entire universe without being off by more than the width of a proton. However, pure mathematicians have, for the sake of accuracy alone, calculated pi out to more than a trillion digits.

Approximations of pi were worked out independently by ancient Greek, Babylonian, and Chinese mathematicians. The accuracy of pi has improved over the centuries, but perfect accuracy isn't needed for many engineering procedures. The Babylonians had it recorded as 25/8 (3.125), and still managed circular columns.

If you have a computer or calculator available and it for some reason does not have a built-in pi constant, you can generate pi as 4*arctan(1). Make sure your calculator is set to radians, not degrees.
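If you'd rather let the machine check these numbers, here's a minimal Python sketch (my own illustration) that generates pi from 4*arctan(1) and measures how far off 355/113 is:

```python
import math

# Generate pi from the arctangent identity: arctan(1) = pi/4 (in radians).
pi_from_atan = 4 * math.atan(1)

# The engineer's fraction and its relative error versus the library constant.
approx = 355.0 / 113.0
rel_error_percent = abs(approx - math.pi) / math.pi * 100

print(pi_from_atan)       # 3.141592653589793
print(approx)             # 3.1415929203539825
print(rel_error_percent)  # about 8.49e-06 (percent)
```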

Pi also shows up in a number of statistical functions.

Thursday, March 12, 2009

Auto Parking

A classmate of mine tells me he wishes that his car would park itself. To some degree, this already exists. Some luxury cars now advertise that they can parallel park themselves, as this is a tricky maneuver that boggles the minds of most drivers. It also lends itself fairly well to automation, since the same technique works every time and the system can rely on the human driver positioning the car so that it won't smash into another car. (The car cannot, as of this point, see.)

Still, most other parking procedures are equally straightforward. The car can have pre-defined procedures that pull it straight forward into a space and stop, slide it into a diagonal parking space, or perform the already-solved parallel-parking technique of moving forward, sliding diagonally back into the space, and then pulling forward to straighten out, at which point the car stops.

Drivers would need some training in the use of these procedures, as well as some kind of way of indicating to the car which to use.
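As a toy sketch of that "canned procedures plus a selector" idea in Python (purely my own illustration; the maneuver names are made up, and a real car would command steering and throttle rather than print messages):

```python
from enum import Enum

class Maneuver(Enum):
    PULL_IN_FORWARD = "pull straight in"
    DIAGONAL = "slide into a diagonal space"
    PARALLEL = "parallel park"

def pull_in_forward() -> None:
    print("Creeping straight forward into the space, then stopping.")

def diagonal_park() -> None:
    print("Angling into the diagonal space, pulling in, stopping.")

def parallel_park() -> None:
    print("Pulling alongside, backing in at an angle, straightening out.")

# The selector: whatever button or stalk the driver uses maps to one entry here.
PROCEDURES = {
    Maneuver.PULL_IN_FORWARD: pull_in_forward,
    Maneuver.DIAGONAL: diagonal_park,
    Maneuver.PARALLEL: parallel_park,
}

def park(selected: Maneuver) -> None:
    """Run whichever canned parking procedure the driver selected."""
    PROCEDURES[selected]()

park(Maneuver.PARALLEL)
```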

Tuesday, March 10, 2009

The Retrofuture

It's always a strange experience to look at past predictions of what the future would be like.

In the 1930s, the Great Depression made people think that the future would be worse, because there was no sign that it would ever, ever end. People predicted more economic nightmares, death by robots, fascist control of the entire earth, and so on. Few people were predicted to survive. Those that did would presumably look forward to being dead.

In the 1950s, fresh from Allied victory and excited about space, people predicted an optimistic future. An article talks about a typical family from the year 2000. They wear suits made of what looks like plasticized metal, live on the moon, and eat rehydrated foods, if not outright food pills. (Which one was depicted depended on the writer.) Of course, human imagination being what it is, the predicted future-tech looked very much like a logical extension of 1950s technology with a tad more automation. People still cooked food in ovens, pushed vacuum cleaners, and prepared two children for school, a boy with straight blond hair and a girl with curled brown hair, a skirt, and a rag doll. TV remained the rage, in slightly smaller cabinets and still black and white. Russia would remain a rival, neither beaten nor victorious.

In the 1980s, America's economy was down and Japan's was up, so every prediction of the future involved Japanese domination. If America was not predicted to be outright conquered, it was predicted to be economically dependent. All the corporations would be Japanese. All your bosses would be Japanese, and you'd better hope, for your continued employment's sake, that you knew how to communicate with them. A cultural absorption was also predicted. This idea was only scuttled in the mid-90s, when Japan itself collapsed economically and America recovered. Japanese culture remains popular with America's nerds, but not for the reasons predicted.

In all of these, fashion is assumed to stay frozen at the writer's own moment, paradigm shifts go unnoticed, and the era's big obsession is assumed to be a permanent driving fixture from then on. This failure of human imagination is one reason why I hesitate to imagine the future. The 1930s writer couldn't conceive of microwaves, whether used for cooking or communications; the 1950s writer could not grasp the very concept of the personal computer, nor the collapse of the Soviet Union. That space travel would seem passe after a few important landmarks had been reached did not register. (People were quite willing to endure the expense when national prestige was on the line, but very unwilling once no further bragging rights could be extracted.) The very idea of the Internet would have seemed actively insane before 1980 or so. (Yes, ARPAnet existed, but few people even knew what that was.)

So if you ask me to predict the future: I predict that by 2015, some new thing will occur that changes the way we see the world. I cannot predict what field it will be in. I cannot predict how fashion will change, other than that what is currently favored will seem dumb and something different will be favored. (Not necessarily something new; fashion sometimes has retro moments where styles are copied from an earlier era.) I predict that the US 2012 election will slightly favor Democrats over Republicans, and that the 2016 election will favor Republicans, if they haven't destroyed themselves by then.

I predict elections the way I do because US politics tends to be cyclical. People slowly get angry with one party's errors until that party can no longer succeed and loses power. There tend to be only two parties at any given time because of the way the electoral system works. If one party gets too far out of touch with the populace at large, it fails until completely destroyed, and some other party rises to take its place. The first party to be destroyed was, ironically, that of the founders themselves: the Federalist party's positions had become completely moot by 1820, and it died, with the Whig party eventually rising in its place.

Monday, March 9, 2009

Economics rears its head again

The United States is in a depression, as is, to my knowledge, the rest of the world. People have still not recovered from the low point after the subprime crisis. While there are very good descriptions of both how the crisis came to be and what effects it may have on us, vicious fighting is now going on over what to do about it.

As far as I can tell, the crisis currently revolves around a lack of confidence on anyone's part in the economy. Consumers are afraid to make purchases, as jobs are hard to come by and the money they spend is not being replaced. Companies are afraid to hire, because purchasing is down. The two endlessly feed each other in some kind of bizarre catch-22.

Banking is a big part of this. The banks hold roughly $2 trillion in bad debts that they effectively can never recover. In other countries, nationalization would be called for, but that is unpopular here: many people see nationalization of private companies as the first step toward communism, a feared ideology. Ailing companies could be bailed out, which gets a lot of flak because this situation is their own fault in the first place. Failure could be permitted, as bank depositors are protected by the FDIC, but this would be more expensive still; the FDIC does not provide its insurance for free.

Strangely enough, one economic group has recovered. The electronics industry in Taiwan reports vigorous success today. There are some industries known to do well in depressions, mostly those having to do with escapism. Movies, alcohol, and theme parks are predicted to do well also.

If one has money, now is a good time for certain moves, such as buying land, stocks, or bonds. This situation cannot last forever, but it is quite grim and worrying for everyone, especially because it is worldwide. No country has completely escaped.

My proposal is to treat this like the Great Depression and stimulate the economy with massive infrastructure spending. The newly employed infrastructure workers would then spend the economy back to health.

Tuesday, March 3, 2009

Roadposts

I have more than 100 entries now. Which is awesome, but I'm sure you wanted me to come up with some sort of insane invention.

Fine. I propose a highway consisting of countless little conveyor belts, thereby splitting the energy needs of propulsion between the car and the city power grid.

What's that? The tires might get caught between the belts? The belts would have to be implausibly long? And getting onto the offramp would involve an abrupt change in belt speed?

If this seems like a bad idea, it's because I thought of it when I was ten and had studied neither mechanics nor physics yet.