dotnet, testing

I recently had to do some performance profiler evaluation for .NET applications and I figured I’d share my results. Note that it’s as scientific as I could make a subjective review (e.g., “friendly UI” might mean something different to you than to me), but maybe it’ll help you out. Also, I’m not a “profiler expert” and, while I’ve used profilers before and understand generally what I’m looking at, this isn’t my primary job function.

The five profilers I tried out are in the table below, along with the features I compared. An “X” in the box means the profiler has the feature. (Briefly, on the measurement styles: sampling periodically polls the running app’s call stacks and is low-overhead but approximate, while instrumentation injects timing probes into the code itself and is precise but slower.)

Testing was done on a dual-2.8GHz-processor machine with 4GB RAM running 64-bit Windows Server 2008 R2.

| Feature                    | VSTS 2008 | ANTS Perf 5.2 | VTune 9.1 | dotTrace 3.1 | AQtime 6 |
|----------------------------|-----------|---------------|-----------|--------------|----------|
| User Interface             |           |               |           |              |          |
| Visual Studio integration  | X         |               | X         |              | X        |
| Standalone application     |           | X             | X         | X            | X        |
| Friendly/easy to use       | X         | X             |           | X            |          |
| Robust reporting           | X         | ?             |           |              | X        |
| Measurement Style          |           |               |           |              |          |
| Sampling                   | X         | X             | X         | X            | X        |
| Instrumentation            | X         | X             | X         | X            | X        |
| Measurements Recorded      |           |               |           |              |          |
| CPU time                   | X         | X             | X         |              | X        |
| Wall-clock time            | X         | X             | X         | X            | X        |
| Additional perf counters   |           |               | X         |              | X        |
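If the difference between CPU time and wall-clock time in the table isn’t obvious, here’s a minimal C# sketch (my own illustration, not tied to any of these tools) showing it: sleeping accrues wall-clock time but almost no CPU time, while a busy loop accrues both.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TimingDemo
{
    static void Main()
    {
        Stopwatch wall = Stopwatch.StartNew();
        TimeSpan cpuBefore = Process.GetCurrentProcess().TotalProcessorTime;

        Thread.Sleep(1000); // wall-clock time accrues; CPU time barely moves

        long sum = 0;
        for (int i = 0; i < 100000000; i++) // busy loop: both accrue
        {
            sum += i;
        }

        wall.Stop();
        TimeSpan cpuAfter = Process.GetCurrentProcess().TotalProcessorTime;

        Console.WriteLine("Wall-clock: {0} ms", wall.ElapsedMilliseconds);
        Console.WriteLine("CPU time:   {0} ms", (cpuAfter - cpuBefore).TotalMilliseconds);
        Console.WriteLine(sum); // keeps the loop from being optimized away
    }
}
```

A profiler that only records CPU time will make the Sleep call look free; one that records wall-clock time shows where the app actually spends its life, blocking included.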

Notes

VSTS 2008: This requires Visual Studio, which means you have to have VS installed on the machine running the app you’re profiling. That said, this was the easiest to get results from and the easiest to interpret.

ANTS Perf 5.2: In general this appeared to be the best balance between “robust” and “usable,” but I couldn’t actually see the report that came out because it locked up the UI thread on the machine and ate 3GB of memory. I’ve asked about this in the forums; it turns out this is fixed in the next version, 6, currently in EAP.

VTune 9.1: I couldn’t actually get a profile to run using VTune since it complained of being “unable to determine the processor architecture.” As such, I don’t know how well the reporting works.

dotTrace 3.1: When I ran dotTrace 3.1 on a multi-proc system, I got several timings that came out as negative numbers (-1,000,289 msec?). You can fix this by setting the processor affinity for the thing you’re profiling. I tried a nightly build of dotTrace 4.0 and that’s fixed there. dotTrace 4.0 will also let you profile a remote application - something the others don’t support.
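If you want to set that affinity from code rather than through Task Manager, here’s a minimal sketch (my own workaround illustration, not a dotTrace feature) that pins the current process to a single CPU; the affinity value is a bitmask with one bit per core.

```csharp
using System;
using System.Diagnostics;

class AffinityDemo
{
    static void Main()
    {
        // Pin this process to CPU 0 only (mask 0x1 = first core) so
        // per-thread timings can't drift across processors.
        Process.GetCurrentProcess().ProcessorAffinity = (IntPtr)0x1;

        // ... run the code being profiled ...
    }
}
```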

AQtime 6: AQtime has a lot of power behind it but lacks the usability of some of the other profilers. It appears that if you take the time to really tweak your profile project settings, you can get very specific data from an analysis run, but that tweaking isn’t a small feat. I spent a good hour figuring out how to profile an ASP.NET application in the VS dev server and setting it up. Also, while it may well be due to my lack of experience with the tool, AQtime had the most noticeable impact on the application’s runtime performance: it took several minutes for the first page of my app to load in the browser.

For now, it looks like the VSTS profiler is my best bet. If I could figure out the UI problem with ANTS, or if dotTrace 4.0 was out, I’d say those options tie for my second choice. The VTune profiler seems to be the most… technical… but it also desperately needs a UI refresh and seems geared toward profiling unmanaged code, where managed code is a “nice to have” rather than a first-class feature.

UPDATE 1/21/2010: I added AQtime to the list of profilers I tried out. Also, I removed the “VS integration” checkmark from ANTS because, while it adds a menu entry to VS, all it does is start up the external application. I’m not counting that. Finally, I found out my ANTS problem is fixed in the next version, 6, currently in EAP. Since it’s not released, I still have to go with the VSTS profiler, but once it’s out, I’d vote ANTS.

gaming, playstation

[GTA: Episodes from Liberty City]

Over my holiday break I spent some time playing Grand Theft Auto 4: Episodes from Liberty City. I’m not quite finished with it, but I’ve gotten far enough that I have an opinion.

Let me give you a frame of reference so you understand where I’m coming from.

I’m a huge Grand Theft Auto fan. Whenever a new Grand Theft Auto comes out, I take a whole week off work and dedicate the entire week just to playing the game. I get the expensive edition with all the bells and whistles. I finished GTA4 with 100% completion. I am, as far as I can tell, the target market for Grand Theft Auto.

Given that, when I heard there were expansion packs coming for GTA4, I was really excited. Add that to the holiday season and one of those frozen buckets of daiquiri, and I’ve got the most awesome cheap vacation ever. They released it as separate downloads or as a self-contained disc (same price), so I got the disc (love that Xbox DRM) and the strategy guide, and when vacation time came around, I was in.

There are two expansions: “The Lost and Damned” and “The Ballad of Gay Tony.” I have very different feelings on each expansion.

In “The Lost and Damned,” you play the leader of a biker gang. You go on several missions to fight rival gangs, to get money for your own gang, and to help out friends. The missions themselves are fairly standard GTA fare, but they’re fun. I was partial to the gun battles (there were several) since many were fairly large scale with lots of enemies. A couple of missions were difficult, but I never really struggled with them - I might have to run them two or three times, but I could get them. The feel of the whole thing was good and got you involved with the story. Very nice.

I did not play “The Lost and Damned” through to 100% completion. There were some race missions that were optional to the completion of the story and I’ve always absolutely despised race missions in any of the Grand Theft Auto games. Driving around the town at my own pace is fun, but the driving really isn’t the goal in my opinion; it’s a means to get you somewhere, so not having pressure when driving is key. The driving controls are pretty sloppy on most vehicles and shooting while driving takes some serious getting used to. Anyway, I didn’t do the races because I don’t like the race missions, but I suppose I could have if I really cared.

So, overall, I liked “The Lost and Damned.”

After “The Lost and Damned,” I moved on to “The Ballad of Gay Tony.” In that one, you play a guy who runs some night clubs with a business partner, Gay Tony, who is always getting into trouble that you get to bail him out of. I’m not done with it yet, but this one is where I start having some fairly strong opinions.

First off, I like the characters in the expansion a little better than the ones in “The Lost and Damned.” They’re a lot more colorful and more fun, so big plus for the story. I particularly like the additional tie-ins with the characters from the original GTA4 game. There were a few tie-ins in “The Lost and Damned,” but the connections with “The Ballad of Gay Tony” are, like the characters, more fun and colorful.

I also like the addition of the “base jumping/parachuting” activities. It reminds me of Grand Theft Auto: San Andreas, which was a really fun game.

That said, I’m actually pretty irritated with “The Ballad of Gay Tony” and here’s why:

They grade you on how well you did on each mission.

Whenever you complete a mission, you get this statistics screen that pops up and tells you percentage-wise how well you did. There are some [seemingly] arbitrary criteria that you’re supposed to fulfill to get 100% completion. Of course, you’re not told what the criteria are until after you’ve finished the mission, so there’s problem number one.

The real problem is that many missions are unreasonably difficult.

As it stands, on most missions I’m having to run them three, four, five times just to complete them. While I’ve played all the Grand Theft Auto games and invested quite a lot of time in them, I’m not 14 years old anymore. I don’t have the time or inclination to run, re-run, and re-re-run missions to get 100% on them. I’m not interested in (and possibly not even capable of) developing the Nintendo-timing snap-reflexes required to fulfill the criteria you won’t tell me about until after I’ve run the mission six times and barely completed it.

Not only that, but “grading me” on how well I did is a huge distraction. I’m not in the “sandbox environment” anymore, where I can do whatever I want or solve the mission however I see fit. Now I’m more on rails, having to do things within a predefined time limit using a specific set of resources in a specific way. That totally defeats the purpose of the thing, in my opinion, and makes it feel less like I’m my own character in my own environment and more like… well, more like I’m just playing a standard platformer. If I wanted that, I’d get the standard platform game and move on.

I won’t even get into the fact that you can’t save wherever you’d like to in GTA games - you have to save between missions and you have to go all the way back to your safehouse to do it. If you’re in the middle of a really difficult mission that takes 20 minutes to finish and you get killed at the 19th minute… well, too bad. Get killed on the way back to your safehouse? Tough cookies. (They did implement an “autosave” feature in GTA4 so you at least don’t lose your progress if you complete a mission, but still.) Yeah, that’s how it’s always been, but it doesn’t make it right.

I won’t even get into some of the tedious missions they added like “Club Management,” where you’re supposed to help manage a night club and basically walk around from room to room and get forced to “scan the floor for trouble” by looking around with a thumbstick. Sometimes you even get to go run errands for people like escorting a VIP from one club to another. No, thanks.

“The Ballad of Gay Tony” has turned GTA from a game into a chore. I don’t want to compete with other people’s standings on how well they did on the missions. I don’t want to be pulled out of the sandbox environment. I don’t want to “replay missions” to see if I can improve my score. I liked the “pass/fail” approach that was going on before… but now that there’s grading and achievements attached to your “percentage complete” on the missions, I’ll never get 100% on this thing, and really, I’m not interested. And I’m not interested in tedious crap that just eats up time but isn’t actually fun.

Should that matter to Rockstar? Yeah, it should. I’m not interested in getting 100% because it’s too hard, but I am a completionist, too, so not being able to get 100% irritates me. It makes me feel like I’ve been ripped off because I bought a game I want to finish but really won’t be able to. It’s frustrating and irritating.

Yeah, you can argue that it’s all in my head and I’m not being forced to get 100%, and that folks who want that additional challenge now get it and I should just ignore it. I just have to question how many folks actually wanted that challenge and whether that group of folks is really the target market. Maybe they are and I’m alone. I sure hope not. I like my vacations in Liberty City and I hope I don’t have to find somewhere else to go.

Net result: Go for “The Lost and Damned” but skip “The Ballad of Gay Tony.”

I’d love to be able to give this feedback to Rockstar somehow but I’m not sure how. If you know, leave it in the comments for me.

media, windows

I found I was running out of space with all of my DVDs and such, even after adding an eSATA port multiplier and a few 1TB drives. I only have one drive slot left, and while at first I thought I’d fill it, I realized that doesn’t leave me much wiggle room in the event of a real emergency where I need to do some fancy drive swapping. As such, I decided to replace one of the 500GB drives with a 2TB drive. The 500GB drive I took out will stand ready as a replacement for the system drive should catastrophe strike.

I started the upgrade with about 750GB free because I wanted to be sure there was enough free space to remove the 500GB drive without losing any data.

Post upgrade, I have a total of 7.73TB in storage with 2.08TB free.

[My WHS storage screen]

Given that I’ve figured DVD images run about 6.7GB each, the 2.08TB free gives me room for roughly another 300 DVDs (2.08TB is about 2,100GB, and 2,100 / 6.7 ≈ 310) before I run out of space. Of course, when I get down to a bit over 1TB free, I’ll have to consider what my upgrade options are in case I need to remove a 1TB drive to replace it with something larger.

UPDATE 1/9/2010: Turns out I got a bad drive. The first night it was in I got a bunch of errors from WNAS.SYS telling me something about “VRM temperature too high.” Doesn’t make a ton of sense, but that’s what happened. Anyway, that first night it totally disappeared from the storage pool, as if by magic. The second night I decided to re-seat it (thinking “bad connection”) and run chkdsk on all drives. Got the WNAS errors again and a bunch of disk errors, so… back it goes. Most of my drives are Western Digital and the drive I tried out was a Samsung. Being a little technology-superstitious, I’ll probably get a WD drive as a replacement.

UPDATE 1/15/2010: I put a Western Digital Caviar Green 2TB drive in and I’m back up to 7.73TB. So far I’m not seeing any of the weird WNAS.SYS errors I was seeing before which leads me to believe I had a bad drive. Every other drive in the system, save the system drive, is a WD Caviar Green drive, and I’ve had good luck with them, so I’m hoping my luck will hold.

UPDATE 1/16/2010: I see the WNAS.SYS temperature warning errors again, but it appears that so many in succession is generally understood to be some sort of bug in the driver rather than a health issue. The system didn’t restart itself or anything, so I guess I’ll just watch it. One thing I found while I was looking for the solution to the WNAS.SYS issue is this article over on the HP site that says how Samsung SpinPoint drives will suddenly “disappear” from the system and it’s a compatibility issue. As it turns out, that’s the type of drive I ordered that failed - a Samsung SpinPoint 2TB. Looks like the WNAS.SYS error and the drive failure were unrelated. I’m still watching how this WD drive behaves. I can ignore false errors in the logs (though it’s fishy that they show up when I add a 2TB drive - maybe I’m crossing some size boundary that causes the bug to show up?), but if a drive “disappears” on me, that’s trouble. I’ll probably wait a week or so before putting any additional info on the server that might make it so I can’t remove the drive.

UPDATE 6/16/2010: Be careful of using the WD Green drives. Only some model numbers appear to be good.

In dealing with today’s technology, I feel like I’m inundated with what I usually refer to as “fiddly shit”: constant, tiny maintenance tasks to make sure things are still working together correctly. No one task is a big deal; most take under five minutes to fix. Some are larger or more chronic issues that require research and troubleshooting over the course of weeks. Let me throw out some examples of recent issues:

[D-Link DAP-1522 wireless access point/bridge]

Wireless networking at home. I got Verizon FiOS, and the router they provide only does wireless-G networking. I wanted a faster network to accommodate my gaming and my media, so I added a wireless-N access point. This added a ton of fiddly shit to my list:

  • Access point setup and maintenance. I bought a DAP-1522 for its supposedly easy setup. Setting the thing up was not nearly as straightforward as the instructions would have you think. Even now that it’s set up, I sometimes find it won’t connect things at wireless-N speeds, dropping them back to wireless-G. Rebooting the access point (pulling the plug) sometimes fixes this, but it also requires me to reboot anything that was previously connected to the network because, for some reason, things won’t just reconnect.
  • Conversion to WPA security. There is also an undocumented “feature” on the DAP-1522: if you use WEP security, the access point won’t give you wireless-N connectivity - everything only ever connects at wireless-G. That’s not documented anywhere, mind you, so some time was spent on the phone with support over it. I was able to connect at N speeds after switching to WPA security… but I have devices (like a Nintendo DS) that only support WEP, so now I have to either support two different wireless networks (WPA with wireless-N via the access point and WEP with wireless-G via the FiOS router) or just not connect the old devices. Right now: no old devices.
  • USB wireless adapters and Windows 7. I upgraded everything to Windows 7 at home and while I love the OS, the drivers don’t seem to be quite up to snuff for any of the USB wireless-N adapters I have. They work… mostly. I found that in some cases you have to install not only the driver but also the stupid “configuration utility” that the manufacturer provides and then things work, even if you don’t use that utility or even ever open it. Also, if the computer goes to sleep and wakes up, it disconnects and reconnects to the network over the course of about the first minute after you log in. It’s stable after that, but come on. Oh, and the wireless-N built-in adapter on my Dell Studio Hybrid just will not connect at N speeds, always preferring G. Still don’t know what’s up with that.

[HP MediaSmart Home Server]

Windows Home Server. I love my Windows Home Server, don’t get me wrong, but there are some fiddly things that crop up:

  • Random disk errors. Every two or three months I’ll get an error that says my system disk is having problems. I run the repair, everything checks out, and all is well with the world again for the next two or three months. Is it the weird disk replication thing they have going on? Is it PerfectDisk for Windows Home Server doing a disk defragmentation number on me? Disk actually going bad? Who knows.
  • More random disk errors. Since upgrading to Power Pack 3, I’ve had a thing where every week or so the server would just stop responding to any requests at all. You ended up having to reboot the server hard and it would all come back. The lockup seemed to correspond to a scheduled task I had running on a client machine that would do a full directory listing of a really large set of files and archive the list. (My DVD library isn’t duplicated, so if a drive dies and I lose files, at least I’ll know what to re-rip.) The error log looked like the server just stopped communicating with the eSATA port multiplier. I found some upgraded drivers, and we’ll see how that goes.

Media sharing. I’ve got my media center solution that I’m working on, and one of the biggest challenges is figuring out what format the media should be in. The DLNA spec supports some formats, Windows Home Server supports some formats, Windows Media Center supports some formats… but which is the right one for me? I’m lucky to have found something like Asset UPnP that will transcode audio formats into something more universal, but that’s just audio. What about video?

Video editing. I got a Creative Vado HD for Christmas. I like the recording capability but the little editor that comes with it is severely lacking. If you don’t want to use that editor, at least on Windows, you’re into something like Sony Vegas. But if you want to edit the videos the Vado records, you have to figure out that there’s another codec you have to install.

My point on all this is that I’m a geek and I have the knowledge and skills to at least know where to start to figure this sort of thing out. What do the non-geeks do? Do they just give up and use everything out of the box and accept the crappy limitations of the products and complain they don’t work? Do they get a geek friend/family member to just continually come fix it?

I can see the appeal of the homogeneous environment. If you just give in and accept the box in which a specific brand (Apple, Sony, whatever) places you, everything pretty much works together. If they don’t have a product that does what you want, well, it’s just “not possible” right now and you wait.

As I get older, I won’t lie - this sort of thing appeals to me. I’m tired of tweaking and fixing and fighting the fiddly shit that is inherent with getting all this to work together. I don’t mind investing time in getting things set up and configured appropriately as long as I don’t have to keep adjusting and reconfiguring and troubleshooting. I just want it to work. It should. Why doesn’t it?

GeekSpeak

I have some files (like my local Subversion repository, some documents, etc.) that I need to sync between computers, and Dropbox was recommended to me as the way to get that done. I signed up, installed it, and it works brilliantly.

That said, my primary complaint is that it only synchronizes files inside a special “My Dropbox” folder that it creates. Anything you want to synchronize has to live in there. Thing is, while I don’t mind changing the location of some things, like my documents, I really would rather not change the location of other things, like my local Subversion repository. I like it in “C:\LocalSVN” rather than “C:\Documents and Settings\tillig\My Documents\My Dropbox\LocalSVN” or whatever.

Turns out you can use the magic of symbolic links to fix that right up. If you create a symbolic link (junction point) inside “My Dropbox” to content that actually lives outside “My Dropbox” then the content gets synchronized just fine but can live wherever you want.

If you are on Windows XP, you’ll need to go get a free copy of Junction (a Sysinternals tool) and put it somewhere in your path, like your C:\Windows\System32 folder. In Windows Vista or Windows 7, you’ll use the built-in “mklink” command.

  1. Get Dropbox set up and synchronizing on your computer without the external content.
  2. Open a command prompt as an administrator on your machine.
  3. Change to the “My Dropbox” folder that you set up. In Vista or Windows 7 it’ll be like:
     cd "\Users\yourusername\Documents\My Dropbox"
     In Windows XP it’ll be like:
     cd "\Documents and Settings\yourusername\My Documents\My Dropbox"
  4. Create a directory link to the folder containing the external content. In Vista or Windows 7 it’ll be like:
     mklink /d "External Content" "C:\Path\To\External Content"
     In Windows XP it’ll be like:
     junction "External Content" "C:\Path\To\External Content"

That’s it. Dropbox will see the symbolic directory link as a new folder with content it needs to synchronize and it’ll get done.
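If you want to verify that Dropbox is looking at a link rather than an ordinary folder, here’s a small C# sketch that checks for the reparse point attribute that junctions and symbolic links carry. The path below is hypothetical - adjust it to wherever your link lives.

```csharp
using System;
using System.IO;

class JunctionCheck
{
    static void Main()
    {
        // Hypothetical example path - substitute your own link location.
        string path = @"C:\Users\yourusername\Documents\My Dropbox\External Content";

        // Junctions and symbolic links are directories with the
        // ReparsePoint attribute set; plain folders don't have it.
        FileAttributes attrs = File.GetAttributes(path);
        bool isLink = (attrs & FileAttributes.ReparsePoint) != 0;

        Console.WriteLine(isLink
            ? "That's a junction/symlink - Dropbox will follow it."
            : "Plain directory - no link here.");
    }
}
```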

Note that you can do things the other way around, too - move the content into the “My Dropbox” folder and then create the symbolic link from the original location to the moved content… but doing it this way means you don’t have to do the moving to begin with. Admittedly, I kinda wish I had figured this out before I moved everything, but now I know.