Over the years we've had a number of people ask for transcripts of some of the shows to make them more searchable. Well, we're experimenting with the folks at PodsInPrint, who will transcribe as many TWiT.tv episodes as possible. They can turn a transcript around in less than 24 hours. Here's their sample of TWiT182. I'm still hoping to have a template for these on http://wiki.twit.tv but there's no template for them at this time. What do you think?
Kryten of Red Dwarf, aka Robert Llewellyn, appeared on MacBreak Weekly last week. He spoke about the upcoming Red Dwarf reunion and his new podcast "Carpool" found on his LlewTube Channel. Here's a bit of last week's conversation:
Llewellyn visited the TWiT Cottage last summer to chat with Leo. Here's a clip someone picked off the Live feed:
Since we went all out when building the UGM, I spent a lot of time building and perfecting the system. Because of this, I would like to share some tips for building your own gaming machine without most of the pain and headaches I had to go through. Essentially, I want you to get the most system for the least amount of time. Here are some things that have a high payoff with little headache.
Setting up SLI
SLI, or Crossfire for AMD graphics cards, has to be the single most effective way to boost the performance of your gaming computer. These technologies allow you to combine multiple GPUs and make them work together for your benefit. Combining multiple processors is a bit tricky when working with CPUs, as you know from the disappointing scaling we have been seeing in multi-core CPUs. GPUs, on the other hand, scale much better, since it is somewhat easier to break up 3D graphics. For example, you can have one card render the bottom of the screen and another render the top, you can have an individual GPU render every other frame, or you can do scan line interleaving. There are, of course, as with all things in life, diminishing returns. Four GPUs don't scale as well as three, and three do not scale as well as two. There is a new chip coming out... supposedly... called the Hydra that claims to allow GPUs to scale far better than they currently do, but let's pretend it is vaporware until we finally see it for sale.
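The diminishing returns can be sketched with a toy model. The per-card efficiency number below is purely illustrative (not a measured figure; real scaling depends heavily on the game and drivers), but it shows the shape of the curve: each extra GPU adds a smaller slice of a full card's performance than the one before it.

```python
# Toy model of multi-GPU scaling. The 0.85 efficiency figure is a
# hypothetical assumption for illustration, not a benchmark result.
def sli_speedup(n_gpus, efficiency=0.85):
    """Estimate total speedup: each additional GPU contributes a
    geometrically shrinking share of one card's performance."""
    return sum(efficiency ** i for i in range(n_gpus))

for n in (1, 2, 3, 4):
    print(f"{n} GPU(s): ~{sli_speedup(n):.2f}x")
```

Under this model the second card adds 0.85x of a card, the third only about 0.72x, and the fourth about 0.61x, which is the "four don't scale as well as three" effect in miniature.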
Making this actually work is VERY simple. Let's say you are using NVIDIA cards like the UGM: stick two or three cards into their slots on a compatible motherboard. Then place the SLI bridge across them; this is a connector that allows the cards to talk to each other. SLI with slower cards doesn't need a bridge, and high-end cards can often work without it, but if you can, use one; it will greatly increase performance over using the system bus to transfer all of the data. Anyway, install the cards, install the bridge, install the drivers into Windows, and you're ready to engage SLI. Right-click on your desktop and go to "NVIDIA Control Panel." Next, click on "Set SLI and PhysX configuration." All you have to do is click "Enable SLI" and hit Apply. Your screen will do some jiggly stuff and go black for a moment, then it will ask you if you're happy with the new settings, and you say yes.
That's it. Your computer is now twice as awesome at gaming, or more.
Setting up a RAID 0 array
For the UGM we used an ASUS Striker II Extreme motherboard, built on NVIDIA's 790i chipset. All you need to do to build your RAID 0 array is start the system, go into the BIOS, enable the RAID controller on the motherboard, enable RAID on the hard drives you want to use, then save and restart. Now when booting, after the BIOS screen you will see a RAID screen flash up before the Windows boot screen. Hit the configuration key it shows to get into the configuration screen. Select the drives you want, then select "Striped" for the array type, as that is RAID 0.
Hit accept, and you are done. You now have a really fast storage array.
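Why is it fast? "Striped" means the array interleaves data blocks round-robin across the drives, so a big sequential read hits every drive at once. A minimal sketch of the layout (the two-drive setup and block numbering here are illustrative, not tied to any particular controller):

```python
# Sketch of RAID 0 ("striped") data layout: logical blocks are dealt
# out round-robin across the drives, like dealing cards.
def stripe_location(logical_block, n_drives):
    """Map a logical block number to (drive index, block offset on that drive)."""
    return logical_block % n_drives, logical_block // n_drives

# With two drives, consecutive blocks alternate between them:
for blk in range(6):
    drive, offset = stripe_location(blk, 2)
    print(f"logical block {blk} -> drive {drive}, offset {offset}")
```

The flip side, worth remembering: there is no redundancy, so if either drive dies, the whole array is gone.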
Ways to boost performance that will probably give you a headache the first time you try it.
Case Modding to fit radiators.
I customized the UGM case by cutting and drilling a space into the top of it for the radiator to draw air through and mount to. Let me be clear about this: I live in a warehouse that has a fully equipped metal shop. Before you get into this and order the parts, ask yourself... "Do I have the correct tools for this job? Do I know what I am doing?" If not, then stop right there and just buy yourself a case that is pre-made for water cooling, if that's your plan, or one that has good enough airflow for your application.
Danger Den is one company that sells a number of cases for water cooling. Buy yourself parts that fit together rather than making your own, and take your time on your finishing touches.
Water cooling multiple graphics cards
Since there are three graphics cards in the UGM, there is very little room between them. The barbs that came from BFG were made by Danger Den, and they were so small that getting them not to leak drove me insane. I replaced them with EK barbs, which helped keep the setup from leaking, but other problems still exist. To get the hose onto two barbs, you have to pull the cards apart... something you cannot do while they are in their slots. You therefore have to remove all three cards, guesstimate the length of tube you need, connect the cards together beforehand, and then try to put them all into their slots at once. Remember, when removing these cards from their sockets, they have locks, so you have to somehow contort your fingers into unlocking multiple slots at once.
You will tear your hair out over this. The EK barbs really help, but it's very frustrating.
Solutions? Air cooling! Seriously guys, since the drivers for your GPUs only let you overclock all three to the same speed, you can only go as fast as the slowest card... so just air cool them. Secretly, NVIDIA has the ability to clock them independently, but they don't let you, and you did not hear that from me. Anyway, the heat sinks on the cards you can buy these days are so massive and effective that you really are not going to see that much of an improvement from an overclocked triple SLI setup on water rather than air. Water cool your CPU, or one GPU if you want, but let's just leave the triple GPUs on air, mmkay?
Overclocking is really very simple in concept. You essentially fiddle with your voltages and clock speeds until your computer starts to become unstable under high load, and then you drop it back a few notches. Sounds simple enough, yes? In practice however, you will spend hours upon hours running stress tests, blue screening, resetting, and starting over.
Instead, just do a safe overclock and be happy. It's when you are trying to squeeze that last 0.2 GHz out of your processor that you are going to get yourself into trouble. Just go online, google the type of processor that you have, and go a few hundred MHz lower than what most of the overclocking forums recommend. For example, I have an Intel Q6600 in my desktop, which is a quad-core 2.4 GHz processor. Most people online are able to get it up to 3.2 or even 3.6 GHz on water, but I run mine at 3.0 GHz on water. Why? Because it is really not worth my time fiddling with voltages for hours on end.
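The Q6600 numbers above work out like this: core clock = front-side bus speed x multiplier. The Q6600's multiplier is locked at 9x with a stock 266 MHz FSB, so the overclock comes entirely from raising the FSB. A quick sanity check:

```python
# Core clock = FSB x multiplier. The Q6600's multiplier tops out at 9x,
# so overclocking it means raising the front-side bus speed.
MULTIPLIER = 9

def core_clock_mhz(fsb_mhz, multiplier=MULTIPLIER):
    return fsb_mhz * multiplier

print(core_clock_mhz(266))  # stock FSB -> 2394 MHz, i.e. ~2.4 GHz
print(core_clock_mhz(333))  # raised FSB -> 2997 MHz, i.e. the ~3.0 GHz overclock
```

So the "safe" 3.0 GHz setting is a 333 MHz FSB, while chasing 3.6 GHz would mean pushing the bus to 400 MHz, and that is where the hours of voltage fiddling begin.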
Don't get greedy.
It's a bummer that Sarah Lane doesn't do Pop Siren anymore. Here's a great clip from Episode 1, which also features a segment with another Friend of TWiT, Dr. Kiki Sanford, starting a fire. You can see Sarah each Friday at 3 p.m. Pacific on TWiT Live, doing a show called "This Week in Fun" with fellow TechTV/Revision3 alum Martin Sargent. Here's a listing of other TWiT Live shows.
The most obvious technology transition you see evident in this UGM is the growth in the importance of GPU hardware. Early UGMs had one GPU, then you would find two GPUs, and now, with this UGM, three GPUs. I would say the single most important component behind the UGM's mind-bending processing power is its set of three water-cooled BFG GTX 280 graphics cards working in SLI. In the past, GPUs were only used for applications like video games and 3D CAD programs, but now, as operating system GUIs require more graphics power, and as companies like NVIDIA allow developers to take advantage of the processing power of their chips for new uses with technologies like CUDA, I think we are seeing the GPU rise in importance to the level of the CPU, and maybe even surpass it.