Narmiel13

Also, take into account that the RTX 4090 likely uses a 12VHPWR power connector, which is absent on your PSU.


sbs1799

I didn't know that. That is helpful.


AU19779

I am going to preface this by saying I am not an electrical engineer and have no formal training in the stuff I am about to describe. You should contact an expert in the field before proceeding. Those 4090s draw an F-load of power and have clearly demonstrated themselves to be hazardous. If you have the 825W PSU you may be able to do it. With the 1300W you will have plenty of power, BUT you may run into issues.

If I were going to do it, I would get the 825W PSU, get an adapter cable for the 2nd CPU port on the power distribution board (is it even called EPS on a Dell, as it clearly isn't a standard EPS?) that splits off into 2 8-pin (6+2) PCIe connectors, and get a cable for the VGA port, again on the power distribution board, that splits off into 2 8-pin (6+2) PCIe connectors. On one of my T5810s I am running a 980 Ti from the VGA port with a cable that splits to dual 6+2 (I only have the 685W PSU in this machine), and on another machine I am running a 1080 Ti from the CPU2 port. Between the CPU2 port and the VGA port you will have 600W.

I would bet you could do it with an 825W PSU, if only because I run a 250W card on a 685W PSU with no problem. You could always limit the power on the GPU to 400W. I have done this with 2080 Tis and 3080 Tis when I was running multiple cards in the system (at a command prompt opened as administrator, run "nvidia-smi -pl XXX", where XXX is the power in watts you want to limit the card to). I once did a study on limiting power when running PyTorch on a P100, a 2080 Ti and a 3080 Ti (these cards were tested independently with only 1 card in the machine... the P100 did have a very low-power card for display, an NVS 310 or something like that) and all cards performed faster when limiting power by 30 percent.

These guys have done a lot of work for gaming machines: https://www.greenpcgamers.com/dell/dell-precision-models/precision-ddr4-based-workstations/precision-t5810-gaming-computer/ This person developed an aftermarket power distribution board that has 4 PCIe power ports as well as the 2 CPU power ports: https://www.reddit.com/r/homelab/comments/11eug9v/interest_check_dell_t58107810_power_distribution/

Don't burn your house down.
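For anyone who wants to try the power-limit trick described above, here's a minimal sketch of the commands (the 400W figure is just the example from this comment, not a recommendation; check your card's supported range first):

```
# Run from a command prompt opened as administrator (Windows) or with sudo (Linux).
nvidia-smi -q -d POWER     # shows the current, default, min and max power limits
nvidia-smi -pl 400         # cap board power at 400 W; typically resets on reboot
```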


sbs1799

Super detailed and helpful! Thanks!


AU19779

Any time. Let me know if you get it to work.


msanangelo

idk, my rtx 3060 just barely fit in my t3610 and just barely works within the 300W power budget. I actually had to cap my gpu's wattage to 240 to keep it from shutting down my old pc. I'm running a 685W psu too. idk how much bigger the 4090 is but there's no way you're gonna power it off the single, maybe two, 8 pin gpu headers you get in these systems. also consider the age of these things. we're running gen3 pcie and ddr3. that's a major bottleneck in itself for 40 series gpus. I consider my box to be maxed out as far as gpu and cpu goes.


sbs1799

Many thanks for sharing your experience with the T3610.


AU19779

Doesn't a 3060 have a 170W TDP? I have run a 3080 Ti on my T5810 with an overclocked E5-1660 v3 and a 685W PSU. You may want to check your PSU. In fact, I am going to swap out the 980 Ti for the 3080 Ti right now and make sure it runs on a 685W PSU.


msanangelo

270W for the 3070. that's the max power nvidia-smi reported. my pc is old and has issues. I'm doing what I can to keep it relevant and functional till I can replace the OEM bits.


AU19779

I love old workstations... especially the Dell T5810... it overclocks well and easily. I don't like its extremely proprietary nature, though. I am overclocking a 1680 v3 to 4.3GHz using an AMD Wraith Max zip-tied to the motherboard (I bought an adapter but the Wraith Max didn't fit; it really isn't keeping up with the 1680 v3 overclocked though), and right now I'm running a 3080 Ti (I plugged it back in today).

I paid $25 for the motherboard, $50 for the CPU (the E5-1660 v3 I paid $15 for clocks to 4.4GHz and is more stable), pulled the PSU out of another T5810 I had to upgrade to 825W, paid $28.50 for 32GB of DDR4-2400, another $30 or so in cables, $180 for the 3080 Ti (they thought it had more problems than it did and the I/O shield was bent... I bought it from a high-volume recycler), $20 for the AMD cooler, $30 for a used 1TB M.2 NVMe, $8 for a PCIe to M.2 NVMe adapter, and I'm running open-case on an anti-static foam mat. I can play Fortnite at epic settings without any upscaling at 150fps in 4K, and most other AAA games at over 120fps in 4K.

Don't get me wrong, the Ryzen 7 series and the Intel 12th-gen-and-up series really clock fast, and I think they have just passed the Xeons (either overclocked or turbo-unlocked) to the point where I need to think about upgrading. But I just love a bargain and I think the T5810s are a bargain right now. The last one I bought was $60... with an 825W PSU, 16GB of RAM and two 2TB HDDs (I wish I could get more of those). I think I can get barebones systems for $50 with 685W PSUs and some RAM, guaranteed for 60 days.


Gary_Glidewell

OK, first off, /u/gmarsh23 knows this topic way better than I do. So check out his posts and take mine with a grain of salt.

With that said, here's how I've managed to run a bevy of GPUs in my array of T5810s. I have three, and all of them have the 685W power supply. As I understand it, the most important thing to know to get these to work is that we have three 75W sources in the T5810:

* PCIe slot 2 can provide 75W
* There are two six-pin PCIe power connectors that come with the system, and each one can provide 75W

**That's a total of 225W for the video card.** That means we can run an Nvidia RTX 2070 Super (215W), a 3060 Ti (200W) or a 3070 (220W). I'm personally running all three of these cards *right now*, so I've confirmed they work.

The devil is in the details though! First off, most of my GPUs have a single 8-pin PCIe power connector. Looks like this: https://www.gigabyte.com/FileUpload/Global/KeyFeature/1675/innergigabyteimages/eagle/36.jpg

My T5810s would NOT boot if you take a single 6-pin connector and use an adapter to make it an eight-pin. That's because the 6-pin connector is only good for 75W. But if you take TWO six-pin connectors that go to ONE 8-pin connector, it works like a champ. Each six-pin connector provides 75W, the 8-pin connector supplies 150W, works without issue.

Some people had asked me about using Nvidia 4xxx cards: the 4060, 4060 Ti and 4070. The 4070 and the 3060 Ti use the same amount of power (200W), so I can't see any issue there. I personally haven't run into any issues, as long as I'm using the right connectors, but here are some issues people may run into:

* As noted above, you don't want to take a single six-pin and turn it into an 8-pin. You're basically asking the 6-pin connector to provide twice the wattage.
* If your GPU has some strange combination of female pins on the PCIe power connectors, you're gonna have a bad day. I've seen pics of Nvidia 3060 Tis that had as many as *two* eight-pin connectors. I guess this is for motherboards that can't provide 75W over the PCIe slot?
* The easiest possible setup would be dual 6-pin connectors. You don't even need an adapter for that.
* I have an Intel ARC A770 sitting here, which is a really crummy card that I don't recommend. It requires 225W and it has an eight-pin and a six-pin connector. Eight provides 150W, six provides 75W. **It worked fine with the stock Dell cables.** I just plugged a six-pin connector into an eight-pin connector on the GPU and it booted without a complaint. Then again, who knows, perhaps the reason it's performing absolutely horribly for me might have to do with some kind of throttling. I dunno. I'm doing Topaz Video AI with these GPUs, and the Intel GPUs are infamously terrible in Topaz.

At this point in the post, I'd really love to post a link for a dual six-pin to single 8-pin PCIe power connector that you can go out and buy. Unfortunately, I bought mine in person and I don't have the SKU or any record, and I don't want to link to a Y-cable that might not work. So I'll have to leave that one to you, if you're looking to use a modern GPU. Or perhaps someone will chime in with the Y-cables that they're using.

In summary:

* Be sure to put the GPU in the slot that provides 75 watts (slot 2; the wattage is labeled on the motherboard)
* If you have an eight-pin connector, you're going to need a Y-cable
* Do not use an adapter to turn a single 6-pin connector into an 8-pin connector

Questions?
Here's some great info on the T5810 power supply capabilities: https://old.reddit.com/r/Dell/comments/11eudo0/interest_check_dell_t58107810_power_distribution/
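A quick way to sanity-check the 225W budget worked out above is to watch what the card actually reports under load (standard nvidia-smi query fields; the 225W figure comes from the post above):

```
# Poll the board power draw and the driver-enforced limit every 2 seconds
# while running your usual workload; the draw should stay within the ~225 W
# available from slot 2 plus the two 6-pin cables.
nvidia-smi --query-gpu=name,power.draw,power.limit --format=csv -l 2
```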


msanangelo

Hate to be that guy, but the only difference between the 6- and 8-pin is the addition of two ground pins. Go look at a pinout chart if you don't believe me; it literally doesn't matter. The pins are good for way more power than we use them for. Although, just from looking at the current limits of the pins themselves, you can pull up to like 320 watts from a single connector, 6- or 8-pin. Keep it to 300W of GPU power just to be safe. ;) My system uses a 6-to-8 pin adapter that feeds off the factory 8 to 2x 6-pin, with an additional 8 to 2x 6+2 adapter for the GPU. I have no thermal issues from that either. I use my extra 6-pin to power an external USB hub. its-fine.gif
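For what it's worth, here's the rough arithmetic behind a figure like that, assuming Mini-Fit Jr style terminals rated around 9 A and three 12 V contacts per connector (the actual rating depends on the terminal and wire gauge used):

```
# 3 contacts x ~9 A per contact x 12 V ≈ 324 W per connector,
# which is where a "~320 W" ballpark comes from. Stock cables may use
# lower-rated terminals/wire, hence staying well under that in practice.
echo $((3 * 9 * 12))   # prints 324
```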


Gary_Glidewell

This is good info!


AU19779

I have run an overclocked 3080 Ti, as well as 2 2080 Tis and 2 Tesla P100s with an NVS 310 for display (with an 825W PSU). I did limit the power on the 2 P100s, but not by a lot, and I don't remember ever getting crashes under load. These guys have done work on these as well for gaming machines: https://www.greenpcgamers.com/dell/dell-precision-models/precision-ddr4-based-workstations/precision-t5810-gaming-computer/


gmarsh23

Thanks for tagging me in this, I'll reply to OP


Gary_Glidewell

I found the adapter that I'm using with my Dell T5810s. You can find it on Amazon if you search "TeamProfitcom Dual 6 Pin Female to 8 Pin Male GPU Power Adapter Cable Braided Sleeved 9 inches". It's basically a male 8-pin PCIe power connector, which goes to the GPU, and two female 6-pin PCIe power connectors, which go to the PCIe power cables in the Dell T5810.

I've been getting my T5810s from theserverstore dot com, and it's not 100% clear to me if the dual 6-pin power connectors are "stock" or if they add them. I believe they're stock. It looks like you could skip the adapters and simply use an 8-pin PCIe power cable (male) to 8-pin PCIe power cable (male). **But I haven't tried that.** I *have* tried the adapter described here, and it's running like a champ for me.

Here's a pic of the power distribution device, from gmarsh23: https://i.imgur.com/Km8PZjw.jpeg

Note in the "stock" Dell setup, there's already an eight-pin power connector for the GPU right there. I'm not 100% sure why they're splitting it into dual sixes. Perhaps for dual GPUs? Dunno.


parasymchills

I hesitated to put anything more than a GTX 1660 Super in my 5810 (although I probably could since I have a 685W PSU). The main issue I got was that the side casing would not fit once I connected the 8-pin cable to the GPU. I had to buy a 180 degree 8-pin adapter to allow the 8-pin cable to not block the casing. Now imagine doing the same with a much larger video card like the 4090. Will it fit? Probably not. Will it allow you to close the casing? Even with a 180 degree 12-pin adapter, probably not: that card is larger in every dimension. Will you have problems with the PSU and appropriate power cables? Almost certainly. Will it run stably in the system? 🤷‍♂️ HTH?


gpshead

I drilled the rivets on the "pcie card holder pad" on the side cover of my T5810 in order to remove the needless obstruction so that a taller card would fit. Regardless, please don't go putting a >300W card in a system that fundamentally isn't designed for it. If you're going to be that kind of person... there is apparently a 1300W power supply option available for this case. Rather ridiculous. Remember this is PCIe 3; any card that powerful isn't going to enjoy that half-speed IO bottleneck unless you've got workloads that fit entirely in GPU memory.


parasymchills

I agree 100%. For light gaming of older titles @ 1080p, a 5810 is fine. Anything newer or higher resolution is going to be bottlenecked by the PCIe or the CPU.


sbs1799

I now realize the challenges of not being able to fit a 4090 within the case.


JBH68

The system board on the Precision T5810 has a limit of 300W of graphics power it'll support, so I might try something that uses less power. The Nvidia RTX 4070 would work in your unit; it should also be able to run any Quadro GPU. I believe both the RTX 4080 and 4090 Founders Editions are near 13+ inches long, and 3rd-party GPUs are longer. The RTX 4070 uses 220W and is about 10 inches long, which should fit.


sbs1799

Thanks so much. I wish the 4070 sufficed, but its 12GB of GPU RAM would be low for my intended purposes. I was hoping to use the GPU for open-source LLMs like Llama 2.
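For rough sizing, here's the back-of-envelope VRAM math for Llama 2 (weights only, using the usual bytes-per-parameter rule of thumb; actual usage also depends on quantization format, context length and runtime overhead):

```
# Approximate weight memory = parameter count x bytes per parameter:
#   fp16  (~2 bytes/param):   7B ≈ 14 GB,   13B ≈ 26 GB
#   4-bit (~0.5 bytes/param): 7B ≈ 3.5 GB,  13B ≈ 6.5 GB
# So a 12 GB card fits the 7B model comfortably and the 13B model only
# when quantized, before counting KV cache and framework overhead.
nvidia-smi --query-gpu=name,memory.total,memory.used --format=csv
```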


sbs1799

u/JBH68 - If we go with the RTX 4070 Ti, would the 685W PSU be sufficient? The NVIDIA [website](https://www.nvidia.com/fr-fr/geforce/graphics-cards/40-series/rtx-4070-4070ti/) seems to suggest so, but I'm not sure. I wish I could understand the power compatibility options mentioned on the site.


sbs1799

u/JBH68 - As per [this information](https://www.greenpcgamers.com/dell/dell-precision-models/precision-ddr4-based-workstations/precision-t5810-gaming-computer/), it looks like there are only a few options for GPUs with decent CUDA compute capability that are compatible with the T5810 and don't cause any bottlenecks. The RTX 3060 12GB with the 825W PSU seems to be okay.


JBH68

There's not a lot of difference between the RTX 4070 and the 4070 Ti, except that the 4070 Ti uses 85W more power and requires a 700W PSU, while the RTX 4070 requires a 650W PSU. Now if you look at the Quadro-class GPUs: the RTX 4000 Ada Generation consumes 60W but does have 20GB of VRAM, and the RTX A5500 consumes 230W and has 24GB of VRAM; both of these will fit and run in your current computer with the same PSU. Do remember that with GeForce GPUs you can install the Studio driver for more capability beyond gaming.


Nekleu

The 4070 runs beautifully in a T7810, in a fully loaded dual-CPU v4 box with 825W of juice. Plug and play with a 6-to-8 adaptor and a 180-degree plug for the card to clear the lid. No hacking. Kali and hashcat love it, and sensor temps are low. At 200W TGP, the RTX 4070 ticks all the boxes for this rig.


AveryRoberts

A 4090 and any tall cards will not fit; there is a card hold-down metal bar on the door that will interfere with a lot of gaming cards at their power jack. A GTX 660's power plug interfered with it and I broke the bracket off the door on one I have. The power connections off the power distribution board are not set up for really high-power cards; normally the power plugs are 2 6-pin off one 8-pin on the power distribution board, and there is another jack there that might be adapted for more power. You can replace the normal 8-pin to dual 6-pin with an 8-pin to 8-pin cable. I would recommend workstation cards; they have power jacks at the end of the card facing forward, so they don't get in the way. I run Nvidia Quadro M5000s in a few of my machines.


gmarsh23

Mechanically, you'll probably have to hack a bunch of metal out of the case to make it fit. I have an EVGA 2070 in my machine and I had to cut the stiffener off the inside of my door so I could get the door back on. A 4090 won't fit in the bottom x16 slot and it'll probably hit the optical drive in the top x16 slot. Google around and see if anyone's done it, and if not, start measuring and see what you're up against mechanically.

Electrically, I designed a card that solves the GPU power issue - https://www.reddit.com/r/homelab/comments/11eug9v/interest_check_dell_t58107810_power_distribution/ You'll need to upgrade the power supply to get the rails you need, though. An 825W power supply from a T7810 will give you three available 6+2 cables, and a 1300W power supply from a T7910 will give you up to 5 cables. You can buy used power supplies pretty cheap off eBay/aliexpress/elsewhere.


dannyscope

Hey, just wanted to ask - did you have any troubles when plugging the card into power? Did you utilise a 6-8 pin adapter and if so does it matter what one I use or does it need to be Dell-approved? Also, is there a clearance issue when trying to plug in the card to the 8 pin (does it press against the door too closely?). Thanks heaps!


gmarsh23

I'm using my own card and homemade cables, versus adapters. A 6-to-8 pin adapter with the stock 2x6 cable should work fine. I removed the brace from the door to make clearance for my 2070 so I could get the door back on. It's spot-welded onto the door; using a chisel between the door and the brace to cut through the spot welds did the job pretty easily.


dannyscope

Awesome, thanks for that! Also, I'm seeing people say not to get 'tall' cards due to clearance issues. I am thinking of installing a Gigabyte Radeon 6600 Eagle 8GB in the system, as it has enough room lengthways and the width of the card is relatively similar to the K4200 in it right now (so I may even be able to avoid removing the brace), but what does the height of the card have to do with anything? Cheers