Thursday, April 28, 2011

Look, an Educational Halo Video

I created the above video as my final project in Composition & Theory. To make it, I used Adobe Premiere Pro and many, many hours of time. The video is an exploration of the arguments between composition and literature. If many of the jokes are over your head, that's because you aren't in an English grad program. Or you like to go outside. Either way, this was terribly difficult to do, but ultimately very fun. Enjoy!

Thursday, April 8, 2010

On that whole iPad thing...

I recently had someone ask me what I thought about the iPad, to which I replied something along the lines of what follows here. At first I thought it was a waste of time and money, but I think it's much worse than that now. It's a waste of conversation (yet here I am talking about it, oh well). The fact that it's the newest "thing that people want without knowing why they want it" makes matters much, much worse. When the iPod launched, and then subsequently became actually usable and not-too-bad, everyone wanted it, but at least there was a real purpose for it -- music on-the-go, and a lot of it in one place. The iPad is neither an iPod nor a computer, and so it sits in a weird limbo, beckoning to people who are too stupid to really put any research into the thing. It can do some computer stuff and some iPod stuff, but not everything (well, it can do everything an iPod can do, minus the portability). Anyways, let's stop and look at why a normal, average computer user should never get an iPad...

1) Cost. It starts at $500. If you want 3G capabilities, it's $579, and then $30 a month for the unlimited subscription to that service (no contract, though, which is kinda cool?). What does that $500 get you? Not a lot of storage, for one thing -- the $500 model has 16 gigabytes of storage. Here's a flash drive that costs about $35 shipped that also has 16 gigs of storage. Want to double that to 32? Then be prepared to pay $100 more. One hundred dollars. To put that into perspective, here's another drive that costs $70 shipped and holds 32 gigs. Somehow thumb drives have managed to double the storage for 35 bucks, yet Apple will happily charge you 100. Interesting, isn't it? Now, if you want the best iPad money can buy, you're looking at over $800. For something that isn't a computer. It boggles the mind. Then again, Apple always charges tons of money for everything they make.

2) Usability (this is a big one). Apple will have you believe that this thing is the most versatile thing in the world. It isn't. It's a big iPod Touch. It has the same app store that the iPod Touch / iPhone has had for years, and it still can't run multiple applications at the same time. Want to check your email when you're in the middle of reading a book? First you have to close the book app, then go into the email app, then, when you're done, close that and go back into the book app. Can't minimize, can't multitask. Here's a quote from Engadget's review of it:

"For starters, as we mentioned earlier the iPad doesn't support multitasking, save for Apple's own applications: Safari, iPod, and Mail. Everything else you use on the device is a jump-into and then jump-out experience, which means that for things like IM apps, you're either having a conversation or you're not. For those of us who are used to the iPhone way of doing things, that's at least familiar, but if you're looking to have a conversation while getting your email in order (as you would on a laptop), you're out of luck."

Apple came out and said that users "don't want that" kind of usability. On an iPhone, where it's something small and you're probably doing one thing at a time anyway, maybe, but if you're going to release a product claiming to be "magical" and "revolutionary" (their actual words, check the website), you'd think it could actually do more than one thing at a time.

EDIT: Apple just announced the new iPhone OS 4, which will support multitasking-ish capabilities. Not 100% multitasking like netbooks/laptops/real PCs, but kinda close. How nice of them to implement something that everyone else has been enjoying for years, but make it seem like some kind of amazing achievement. Bravo? Still, I'll say what everyone is saying: FINALLY.

Moving on, the thing is a complete pain in the ass to hold in your hand. Think about it. You're going to be holding a 1.6-pound, rigid, metallic minicomputer whenever you want to use it. There are no grooves for your fingers or hands. There is no way it can bend or slip comfortably anywhere. Now think about trying to use the keyboard on it. How will you type? Prop it up on your leg? Use one hand to type? Let's say you figure you'll lay it flat on a table -- oh wait, Apple has made it sleek with a nice-looking, curved backing, meaning it'll wobble every time you try to type something on it while it's on its back. You know something's wrong when, less than a week after it's been out, people are making stands for propping the thing up. I think actually, physically using this would never be comfortable.

The internet browser on this thing is great, though -- it's fast, it's smooth, it's fluid. Okay. But here's the thing: most websites use Flash. It's what powers any internet game site and, in some cases, entire websites. This device does not support Flash at all, and it probably never will. Any website that uses Flash either won't work at all or will work with very limited capabilities. Let me say that again: any website that has Flash as its main way of conveying data to you will not work, ever. That's insanely stupid, and it honestly boggles the mind considering Apple claimed the internet browsing experience on the iPad to be the best you could ever have (their actual words).

Apple's also really pushing the whole "Look! It can play games!" thing. If you've ever tried to play a game on an iPhone, you should know it isn't the easiest thing to do most of the time, and it depends heavily on what kind of game you want to play. Something like Scrabble would probably be fine, but a platformer? Please. Touch-screen controls aren't yet at the point where they'll flawlessly do what you want them to. Now, the iPad does have a better interface and responsiveness, but combine the aforementioned difficulties in holding it with a very prominent home button (hold the iPad in landscape mode and you'll more than likely hit it by accident, exiting the game and pissing you the hell off), and with the fact that the device isn't exactly pocket-friendly or "mobile" like a PDA/PSP/Nintendo DS, and gaming on the iPad becomes little more than a bullet point. Why would you spend $500+ on something that isn't ergonomic for playing games? You wouldn't; it's something you'd only do because you have the device anyway.

3) E-Book Reader. As an e-book reader, it works pretty well. It has a lot of publishers on board, and a beautiful screen for displaying the e-books. Here's my word of caution on this subject, though. For one thing, you must understand that the books you buy are never really "yours". What does that mean? Well, look at it this way: when you purchase a book from Apple's iBook store, you purchase it to your account to be used on your iPad (or other iDevices), and that's it. You can't use it anywhere else, and if for some reason Apple doesn't think you should have it anymore, they can take it away from you. "No they can't!" you might think. Yes, they can. There was a major issue with this about a year ago with Amazon's Kindle. Ironically, it was with the book 1984. Basically, Amazon found out that it shouldn't have listed a certain publication of the book as being available on its e-book store, so they removed it from the store and from the Kindle of anyone who had bought it, with no notification whatsoever. Yes, those people got their money back, but the whole incident showed that when you buy something for your e-book reader, it isn't ever really yours; you're simply buying the right to look at it on that piece of technology. Apple is no different from Amazon: it came out shortly after the iPhone launched that, if Apple wanted, they could disable any iPhone remotely via a "killswitch". This was all over the tech sites at the time. Has Apple ever had to use it? Probably not, but it's unsettling to know that they could, and you'd be screwed.

Personally, I like to hold a book, knowing it's mine, and if I wanted to, I could lend it to someone else, or, better yet, pick it up again 20 years from now and enjoy it exactly the same way. You might joke, "Well, I don't know if I'll be around in 20 years." Haha, but think about six years, or even four: there will probably be a newer, better iPad by then that's faster and makes this one look like an '80s cell phone by comparison. Now you have to buy that (another $500?). Sure, all the books you downloaded can be used on the new one, but there will be a transfer process of some kind, and you still had to buy a new piece of hardware just to keep up with reading books. Or you could just buy the book and keep it on a shelf forever.

But all that aside, the only other "problem" with the iPad and reading books is the screen itself -- it's a backlit screen. Some people, after reading text on a computer screen for an hour or more, tend to have eye-strain issues. The reason the Kindle is so spiffy is that it uses something called "e-ink" instead of a normal, backlit LCD screen. E-ink pretty much mimics how text looks on paper, with no backlighting and no strain on the eyes. Some people will never have an issue with the iPad's backlit screen for reading; for others, it's a dealbreaker. This is really a personal-preference thing.

4) Minimal Design. Apple is known for the sleekness of its stuff. Okay. Well, that's fine and all for an iPhone or an iPod, but with this thing, which is pretty much claiming to be a computer replacement, it kinda sucks. There are NO ports on this thing. No USB, no ethernet, nothing. Actually, I take that back: there's a headphone jack. If you want any of those ports, you have to get an adapter. Maybe you'd never need some of that anyway (like an ethernet port), but there isn't even a card reader so you can quickly and easily load pictures onto the thing. Even the cheapest, slowest netbooks have that stuff. Some people like the minimal design because it's less to worry about, but I think it hinders what this thing could really be capable of.

Now, all of this isn't to say that the iPad is a terrible, terrible device that can't do anything anyone wants it to do. What it does, it does pretty damn well. It works fast and smoothly, and it's basically designed with idiots in mind, meaning anyone should be able to pick it up and use it. I just think that for the price, you could easily get a laptop that does everything the iPad does, with a bigger screen and 300 times more functionality. And if you throw in the thought of getting a netbook, then that device, I'd say, is straight-up better.

But my real suggestion would be to go and hold one somewhere if you can, and use it, and see if after the initial "ooOOo, look at the pretty screen and metal-ness of it" wears off, you could see yourself using it all the time, or for an hour straight, or for seriously typing on, or...etc.etc.

This is my 2 cents. I'd never buy an iPad, and if I were given one, I don't know if I'd use it. I'd probably sell it to an Apple fanboy, get a netbook with a 1.66 GHz dual-core CPU, 2 gigs of RAM, and a 160 gig HD, and use the $125 I'd still have left over to get something else, like a few videogames or a new hard drive for my desktop (the iPad, comparatively, has a 1 GHz CPU and 256 megs of RAM). Sure, I wouldn't look as cool as the Apple fanboys do, but fuck those people.

Thursday, December 10, 2009

Somewhat Modern Warfare 2

A lot has been said about Call of Duty: Modern Warfare 2. I mean a lot. And most of it comes from editors required to spit out an opinion, or from people whose game-playing experiences are (surprise!) pretty much only their experiences and no one else's. This (as is usually the case) means that a large demographic of people are listening to a very small demographic of people in order to formulate an opinion on a playing experience they have yet to...well, experience. Allow me to thus add fuel to the fire in one way or another, and give my sweeping opinion on the MW2 scene.

Is the game good? You bet. There, now that that's out of the way I can move on. Oh, you want me to elaborate? Fine...

If you played Call of Duty 4: Modern Warfare, then you pretty much completely understand everything that there is to understand about MW2. MW2 takes all of the ideas that were initially established in CoD4, expands upon them, elaborates them, makes them prettier, and then gives them back to you for $60 more of your money. Is that worth it to you? I wouldn't say it was worth it to me, but my 48+ hours of playtime so far would beg to differ.

The graphics are better, which is a given. The multiplayer has more game modes and options, which is a given. There are more guns, which is a given. And the single player is way more out there with oodles more "offensive" material, which, again, is a given. This game is the definition of a sequel. It stays within its predefined skeletal structure, but adds a lot more meat to those bones. So, now that we have managed to cover exactly what all other reviews have covered within two paragraphs (ha!), let's move on to where everyone's panties are in bunches. Bunches in people's crotches. Uncomfortable bunches.


Anyways, the multiplayer. If you've played the Xbox 360 version of this game, ignore everything I'm about to type, because your multiplayer experience is the same as Halo 3, CoD4, and probably a zillion other 360 multiplayer shooters out there. You pay your $50/yr for decent multiplayer service, and that's what you get.

But we PC players who like to play our shooters the real way (read: keyboard and mouse) have kind of gotten the shit end of the stick. But here's the thing... for about a week before buying this game I read... and read... and read about all the problems this game has with multiplayer. Forum posts, reddit comments, editorials, you name it, all blasting the multiplayer for the PC. Saying that it's ruining PC gaming... that this is just the start. That after Activision sees the revenue from this game on the PC, they'll understand that PC gamers just don't give two shits about their online experiences anymore, and games will now be just as "broken." Yet I bought it anyway, like everyone else.

But is it really that broken? In a word: no.

Now, now. Don't get me wrong, I think that the system they have implemented is full of problems, and those problems are fairly consistent. But it's nowhere near the nightmare that everyone was led to believe (or at least the one that I was led to believe). Here's where the problem started: no dedicated servers. Infinity Ward (the developers) decided that, for some reason, PCs should now become Xbox 360s and no longer give you a list of servers to choose from. Thus they implemented a system that chooses a host at the start of every game, and that host (a player IN that game) then becomes essentially a temporary server for that game (and maybe the next, and the next). That might sound okay in theory, but if for some reason the game chooses that one guy on the planet who still uses fucking dial-up, or that other dude playing in the most northeastern tip of the US, or that little kid who's trying to play this game on his mom's 5-year-old Dell Inspiron desktop, everyone is pretty much screwed.

Dedicated servers were/are a benefit because they have excellent internet connections. They are localized, too, to an extent. So if you're on the east coast, there's a good chance you can find a dedicated server that's on the east coast, too, and you've got yourself some smooth sailing ahead. Or, if you're playing with friends across the country, you can all find a server that's in-between everyone, and everyone can have decent connections. It was simple to navigate, and a system that's been around for a very long time. And if you all found a server you liked, you could just add it to a list of favorites, and sleep easy knowing that when you woke up at 2 in the afternoon to start your next 14-hour long gaming session you had a safe, happy place to go.

Also, dedicated servers have admins. Admins can ban people...people who cheat, or, as I like to call them, people who like to fuck cacti for pleasure. These cacti fuckers are always a problem in games...but if you played on a decent dedicated server with a decent community, there was a good chance there'd always be an admin in the server to ban these people.

But alas, all of this is gone in MW2. So, in theory, you would expect games to be slower than the slowest shit on the roughest ground that's as flat as a paraplegic's ass. You'd also expect everyone to be fucking cacti, because on the internet, everyone is a total, total asshole. But you know what? That isn't true.

The game's hit detection (when the game determines you hit someone with your bullet/knife and where) is local, meaning on your computer. So, unless the game connection is terrible, things are pretty smooth. I've played a good deal of games with easily over 140 ping and been fine. How often are game connections really, really bad? Not often. In those 48+ hours I've spent with the game online, I'd probably say a really shitty connection has happened maybe 20-30 times. That's less than 2% of the time. I can live with that.

If the game's host leaves, the game picks a new one. This means the game pauses for at most 20 seconds (though usually around 10), and then goes right back to where it was. I haven't seen a problem with this yet, though I'm positive that problems can easily come to fruition (someone leaves, the game picks a new host, that person leaves, game picks a new host that has a shitty computer, etc.).
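The flow described above (pick the best-connected player in the lobby as host; if they bail, pause and re-pick from whoever's left) can be sketched in a few lines. To be clear, this is a toy illustration, not Infinity Ward's actual matchmaking code -- the player names and the scoring by ping and upload bandwidth are entirely made up for the example:

```python
# Toy sketch of peer-hosted matchmaking: the lobby elects the "best" player
# to act as host, and if that host leaves, the game migrates to the next-best
# candidate among the remaining players.

from dataclasses import dataclass

@dataclass
class Player:
    name: str
    ping_ms: int        # lower is better
    upload_kbps: int    # higher is better; a dial-up host ruins the lobby

def pick_host(players):
    """Choose the player with the best connection to act as the match host."""
    # Hypothetical scoring: penalize high ping, reward upload bandwidth.
    return min(players, key=lambda p: p.ping_ms - p.upload_kbps / 100)

def migrate_host(players, departed):
    """When the current host leaves, re-run selection over who's left."""
    remaining = [p for p in players if p is not departed]
    return pick_host(remaining) if remaining else None

lobby = [
    Player("dialup_guy", ping_ms=300, upload_kbps=56),
    Player("decent_dsl", ping_ms=60, upload_kbps=768),
    Player("fiber_kid", ping_ms=20, upload_kbps=5000),
]

host = pick_host(lobby)            # fiber_kid gets hosting duties
host = migrate_host(lobby, host)   # fiber_kid quits; decent_dsl takes over
```

The "pause" players see during migration corresponds to everyone waiting while the `migrate_host` step runs and connections are re-established; if the election lands on the dial-up guy, that's the shitty-host scenario described above.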

But the cactus fuckers are legitimately the most irritating and prevalent annoyance. There are hackers...lots of them. And because of this, everyone showing any amount of skill raises suspicion. You're almost guaranteed to come across either a wall-hacker (someone using a hack to allow them to see through walls, and thus enemy positions) or an aim botter (someone using a hack that automatically aims--and sometimes fires--their gun at enemies' heads) at least once per play session. And because there are no admins, these people often do not get banned. I say "often" because technically the game uses Steam's "VAC" system to weed out hackers...but how reliable VAC is and how soon after a detected hack it bans an account is unknown. I'd say it isn't too great a system.

What does this all boil down to, then? Well, the game is great... the multiplayer is fun to play, but there are problems surrounding that experience. Is the game as broken as everyone has been complaining? No. It just isn't. Infinity Ward's matchmaking system allows for quick-starting games that you can easily set up with friends, assuring full games every time, paired with a good variety of gameplay modes (most of which were already there in CoD4, but whatever). That really isn't a bad thing. Want to play Headquarters Pro? You'll be in a game in no time, no server hunting required. Not that "server hunting" was any bit of a difficult process to begin with, but... a positive is still a positive.

Besides, if you want to play the new CoD on a PC, you don't have much of a choice, do you? I bought the game because I have friends to play with, and I like to have fun with cool people (you do want to be cool, too, don't you?). If you're all by yourself, then maybe this game isn't for you. Or maybe you should look into fucking a cactus.

I do think that this is ultimately a step in the wrong direction for PC games, though. If I wanted to play a console game, I'd buy a console game. This game feels the same (it even has the same price), just with a different control scheme. Infinity Ward, it seems, was lazy. They could have easily implemented the same server system CoD4 had, while also allowing the quickplay party system that's in the game now, if they wanted to. It wouldn't have been difficult, especially considering it's fairly obvious that they just ported a dumbed-down version of Xbox Live's matchmaking system to the PC. This new system caters to dickhead hackers and creates frustrating lag issues that honestly shouldn't be there -- whether or not they're really prevalent isn't the issue... they should not be there in the first place.

So, that's what I think. And yeah, it's just another opinion to add to the heaping pile that already exists, but I think it's been long enough, and I've played the game enough, to have formulated what I've written... maybe more so than some of those day-1 reviews. Oh, and even in games where there are multiple cactus fuckers, I still usually win. So, either the people using the hacks suck (which is why they're using them), or it doesn't really matter that much. I'm going with both.

Till next time.

Thursday, September 24, 2009

BFG is amazing.

Fuck you, I know I haven't posted in a million years. I don't really care.

Erm, I mean. Hey! Long time no see! How's it going? Good? Good! Glad to hear it. Still got that birthday card I sent you? No? Oh that's right, people usually throw those out after a week. No, no, I don't care. I understand. What's that? Oh, her. Yeah she's pretty great. I know, I know, one in a million. Uh huh. Well, look, I have a lot of stuff here, and this basket is getting kinda heavy, so... Oh yeah, yeah, just gimmie a call when you're in the area. We'll totally do something. Alright. Later.

Man, that was awkward.

Anyways, let's get to the point. A year ago this past March, I ordered and received an Nvidia GeForce 9800 GX2 made by BFG Tech. Yeah, it had just come out, and man did I shell out some money for it. But I figured it'd last me a good amount of time, so, whatever. It was a pretty good card, but it started to have some pretty lame issues a few months in. Things would stutter in certain games -- usually flame effects. And those are the best kind of effects, amirite?!

Then games would crash. And then my computer in general would crash. Finally, a month ago, the card made Windows cry, and it no longer would boot with it in there.

So, I figured, "Shit." Not just because the card was shot, but because it wasn't under warranty. See, BFG offers a lifetime warranty, but with this card you had to register within 30 days of purchase to receive this lifelong treatment. I did not do this, because I found the piece of paper telling me so about a week after said 30-day period. Not being one to cause a fuss (and assuming my card would, ya know, work), I shrugged it off, and that was that.

Well, it died. I replaced the card thinking I was screwed. Man was I wrong.

I contacted BFG last week, on a friend's suggestion, just for the hell of it. They didn't ask any questions and instantly gave me an RMA for my card. I was very impressed. But then it got better. Within 24 hours of receiving my card, they had a replacement in the mail for me. And they didn't simply replace it. Oh no. I got a GeForce GTX 285 as a replacement. That, ladies and gentlemen, is almost the best card that Nvidia currently makes. And it's a pretty significant upgrade. All for free. Amazing.

So, if you want to buy a graphics card, buy from BFG. Cause if it breaks, they'll give you a better card. And if it's out of warranty, they'll help you anyway. They're fast. They're just...just wonderful. I was floored. I only wish I were getting paid to type what I am right now. But whatever.

At any rate, maybe I'll post here again, but I don't know. I don't think anyone reads this anymore. If you do, speak up! I'll give you hugs via words. Or something.

Saturday, June 6, 2009

So, let's talk about L4D2.

E3 has come and gone. Last year, I talked about how such a venue seemed mostly useless, especially because of the downfall of its presentation. But alas, just when everyone thought it was over, the old skool E3 of yesteryear stormed back onto the scene with its booth babes and "huge" announcements and expensive party atmosphere. Did it work? Well, in a word, yeah.

The coverage of E3 has been enormous, and with good reason. Hell, Nintendo actually managed to announce things people care about! But I digress from this post's title.

Valve, arguably one of the best developers in video games (up there with Blizzard, I'd say), announced Left 4 Dead 2 at this year's E3, and it pissed some people off. Did it piss me off? No, not really. I was surprised, though. Valve, like Blizzard, doesn't do fast sequels. Even the Half-Life 2 episodes took longer than a year between each release, and those certainly aren't to be considered "sequels". At least not full-form ones, anyway.

Valve's forums erupted with anger from gamers claiming that Valve had betrayed them by undercutting the original L4D. Valve had told the gaming community that it would be supporting and updating L4D for a long, long time, much like Team Fortress 2, which, through constant free content updates, has proven that such a strategy can pay off quite nicely. So, why change it? The answer is probably something like "because they can." L4D generated a good deal of money for Valve, so why wouldn't they want to produce a sequel, which in turn would generate more money?

Oh, right, that whole "promise" thing.

Let's look at how L4D has done with its content-adding-ness:

1 Major update.

...And that's it. Hrm. Well, surely that update contained a veritable shitload of content, yes?

Added 2 Versus maps and Survival Mode.

..oh. Well, gosh, that kind of does suck, doesn't it? Unless those Versus maps were new experiences...

The Versus Maps are "Dead Air" and "Death Toll".

...Wait, the maps that should have been in Versus when the game was released? God damnit! Survival Mode better be awesome..

Survival Mode puts players in the various "crescendo" moments that the normal maps contain, only the zombies do not stop coming until everyone dies. Difficulty increases the longer you are able to stay alive, with games rarely lasting longer than 10 minutes. Also, there is a new map for this mode, the "Lighthouse."

...But the Lighthouse map isn't a full, new map?


And isn't that kind of gameplay already in Gears of War 2 and Call of Duty: World at War (with Nazi zombies for some reason)?


So, let me get this straight...we got two maps we technically already had (and arguably should have had from the start), and then a new mode that's technically in other games and is just an extension of moments in the game already. That, and there's the fact that the new mode will always end in failure, no matter how great a team of players you have with you. So, if one were to be an over-the-top critic/douchebag, one could say that the only "new" content that was given is everlasting failure, and Valve calls this an update?


Cool, I'm going to stop conversing with myself now.

Granted, I don't completely stand behind everything I just wrote there, but I can (obviously) see where people would get upset. Valve has always been considered a beacon of hope when it comes to post-release DLC, showing that if you do something well and make it free, people will continue to buy your game well after its release because of the free goodies. And so we come back to the question: "Why are they changing this?"

I think that L4D might have been a sort of experiment. Valve had this great idea for a game, but it had never been done before. And though they could bank on selling the game on the sheer strength of the Valve brand, I don't think they wanted to (and it probably would have been stupid to do so). So, they made this game and put into it pretty much what they wanted to, and what they released was an extremely polished starter for what they ultimately would have liked to do in the first place, which is L4D2. Would people be able to deal with a purely co-op experience on the internet, land of overwhelming asshats? Would players enjoy doing essentially the same things over and over again, only with a few minor differences in enemy and item placement? Would the AI Director actually be decent, or a pile of crap? Well, they now have their answers.

But does this mean that they couldn't simply update L4D with improvements that would reflect their newly acquired knowledge? Surely they could add new maps, characters, content, etc. without a full-on sequel. After all, they did say that's pretty much what they'd be doing.

Honestly, I don't know, because I'm not Valve. But they're walking on thin ice. They have, however, asked us to trust them, and I think they might have earned a little trust from us. They have from me, anyway. Besides, there are many ways they can approach this whole pseudo-fiasco with intelligence. There's no reason why they couldn't combine the games, à la Rock Band 2, yes? Why not allow all L4D content to be accessible from within L4D2? That way, if they're still going to release additional content for L4D, those who have the sequel can enjoy everything at the same time, seamlessly. Just something to think about.

But, taking a look at what's promised within L4D2 (an entirely new location, new survivors, new weapons, new zombies, new special zombies, new crescendo moments...), I'd say yeah, there's enough there for a sequel. Maybe not a $50 sequel, but at least a $30 kind of thing that merges with the original content. But hey, you know people will buy it anyway, regardless of what it really is or how it's really done. So maybe all this hubbub is pointless.

But maybe it isn't.

Tuesday, May 19, 2009

Many things coming soon.

I have entered the realm of motivation via a bet yet again, and thus must complete 4 games by July 11th, 2009. Therefore, I will actually be writing things here again! Hooray!

Stay tuned...

Sunday, April 19, 2009

Okay, I need some motivation.

Games I have started, but not finished:

Killzone 2
Resistance 2
Valkyria Chronicles
Final Fantasy X
Call of Duty: World at War
Ratchet and Clank Future: The Quest for Booty
Bionic Commando: Rearmed (stuck on the last level)
Megaman 9 (stuck on the last boss)
Dead Space

So, anyone have any ideas on how to tackle this problem?