Search the Community
Showing results for tags 'blender'.
-
Ok, I seem to be having a problem. I'm learning 3D modeling, and after trying programs like 3ds Max without being able to find any good tutorials for it, I've moved to Blender. But my computer seems unable to handle it: I get lag while using the interface, while rendering, and while doing just about anything. I don't understand why, since I have a high-end computer that should handle it more than well enough. I've had Task Manager open while working, and my CPU and RAM usage never go above roughly 30%, yet it still lags badly. Any ideas? Specs: GTX 1080, i7 6700K (overclocked to 4.5 GHz), 16 GB RAM. Any other specs shouldn't matter, but if you want to know, just ask.
-
Hi, I just want to kick off a modding/build project worklog here, which will be more than just a worklog. My colleague is a 3D artist who spends a lot of time working in Blender. The time came when he could invest money in a serious workstation with 3x 1080 Tis, and I'll be the person helping him carry out this project. The main point of the project is to compare the performance of three 1080 Ti Founders Edition cards sandwiched together on a hot summer day versus a fully liquid-cooled solution. It's no secret that the boost frequency of the 1080 depends on the effectiveness of the stock blower, and liquid cooling the cards would even grant some overclocking headroom. Besides having some fun and building a liquid-cooled PC, I hope this thread will reach people who are new to building render stations, among whom I count myself as well. This project is supported by Hex Gear, CableMod, G.Skill, EKWB, and the owner's personal wallet, of course. The parts being used:
Processor: Intel® Core™ i7-6850K
Motherboard: Supermicro C7X99-OCE
Graphics cards: 3x Gainward GTX 1080 Ti FE
RAM: G.Skill TridentZ F4-3200C16Q-32GTZSW
PC case: Hex Gear R80
Power supply: SilverStone ST1500-GS
I personally feel the choice of hardware was not the best, referring mostly to the motherboard and the PSU, but I guess that's how it goes when you're doing something new; mistakes happen. I'd like us to debate the motherboard and CPU choices before I start posting some actual modding work.
-
Hello, I just want to ask those with A10 APUs: does Blender Cycles work on them?
-
Has anybody tried this out? Here is my result: https://hardforum.com/threads/amd-ryzen-blender-benchmark-scores-thread.1919896/
-
performance [COMPARE] AMD "Ryzen" Blender Test
Husky posted a topic in CPUs, Motherboards, and Memory
Hi, the purpose of this thread is to compare different CPUs to AMD's upcoming "Ryzen" by posting how long it takes our systems to render the Blender test file from AMD's website. I'll start with my time (6700K @ 4.6 GHz): 1min:6sec. AMD Ryzen time: 36sec. (Mods: if I did something wrong, feel free to lock and/or remove this thread.)
-
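For anyone tallying results in a thread like the one above, the "Xmin:Ysec"-style times people post can be turned into seconds and head-to-head speedups with a small helper. A minimal sketch in plain Python; the `to_seconds` and `speedup` names are made up for illustration, and the only format assumed is the one used in the post:

```python
def to_seconds(t: str) -> int:
    """Parse a time like '1min:6sec' or '36sec' into whole seconds."""
    mins, secs = 0, 0
    for part in t.split(":"):
        if part.endswith("min"):
            mins = int(part[:-3])   # strip the 'min' suffix
        elif part.endswith("sec"):
            secs = int(part[:-3])   # strip the 'sec' suffix
    return mins * 60 + secs

def speedup(baseline: str, other: str) -> float:
    """How many times faster `other` is than `baseline`."""
    return to_seconds(baseline) / to_seconds(other)
```

With the times from the post above, `speedup('1min:6sec', '36sec')` works out to roughly 1.83x in Ryzen's favor.
-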
Does Blender utilize the GPU more, or the CPU more? From what I've heard, some 3D rendering programs use the CPU more in some cases and the GPU more in others, depending on the program. For example, Octane uses the GPU most of the time, and its website states that it's a GPU-based renderer. Blender doesn't state whether it's CPU-based, GPU-based, or both.
-
Hi. I have a question about Blender. Do I need WinRAR to download and edit files for a tutorial I'm going to make for my friend? Maybe you have some suggestions for where to download templates. Thanks in advance.
- 2 replies
-
- blender
- blender intro help
-
-
I was thinking about getting a graphics tablet for 3D sculpting, especially in ZBrush. I've been learning Blender for several months, but now I'm trying to learn likeness sculpting in ZBrush, and most people say "get a graphics tablet." I don't know much about graphics tablets, but within my budget I've shortlisted these (keep in mind the prices are almost double in my country):
1. One by Wacom – Medium (8.5 x 5.3 in, 2K pressure levels)
2. Wacom Intuos S (6.0 x 3.7 in, 4K pressure levels)
3. Huion H640P (6.3 x 3.9 in, 8K pressure levels)
So which one should I get? I really like the One by Wacom Medium, since it's bigger than the others, but it only has 2K pressure levels. My question is: is 2K pressure enough, or should I get one of the 4K/8K tablets? And since I'm using a 22-inch 1080p monitor, is 6.0 x 3.7 in enough, or would the 8.5 x 5.3 in size be better for me? Please don't suggest tablets from companies other than these two.
- 8 replies
-
- graphics tablet
- wacom
-
Hey guys, I'm kind of frustrated with Blender 2.79. How do I apply a Quick Smoke effect? I created a flow object, added Quick Smoke, and baked it. The question is: how do I save the smoke at that point permanently? I don't want to redo the smoke effect every time I reopen the project, but as soon as I apply the smoke effect modifier, the smoke disappears. Help, please. Helge. Screenshot: http://prntscr.com/nw99hk
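A common answer here is not to apply the smoke modifier at all (applying it bakes only the current frame's mesh data, which is why the smoke vanishes), but instead to persist the simulation's point cache on disk so it survives closing and reopening the project. A minimal sketch against Blender 2.79's Python API; the names 'Smoke Domain' and 'Smoke' are Quick Smoke's defaults, so substitute whatever your scene uses, and note this only runs inside Blender, where the bpy module exists:

```python
import bpy  # only available inside Blender

# Quick Smoke's default names -- substitute your own domain/modifier names.
domain = bpy.data.objects['Smoke Domain']
cache = domain.modifiers['Smoke'].domain_settings.point_cache

# Persist the cache on disk next to the saved .blend file,
# instead of keeping it only in memory for this session.
cache.use_disk_cache = True

# Bake every point cache in the file (the smoke simulation included).
bpy.ops.ptcache.bake_all(bake=True)
```

With the disk cache baked, the smoke should reappear immediately on reopening the file, without re-simulating.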
-
Hi there, I do a lot of video rendering in Blender using my Ryzen 5 2400G. My build also has a dedicated GTX 750 Ti, which my display is plugged into. I'm curious to know whether I can ask Blender to use the integrated Vega graphics to accelerate rendering while I use the 750 Ti for normal use, or whether I can plug my display into the Vega graphics and ask Blender to use CUDA acceleration on the 750 Ti for rendering. I need answers for both cases. Thanks in advance.
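Both setups come down to which compute device Cycles is told to use, and that can be set either in Blender's preferences or from its Python console. A minimal sketch against the Blender 2.80-style bpy API (it only runs inside Blender); here 'CUDA' targets the NVIDIA 750 Ti, while 'OPENCL' would target the Vega iGPU instead:

```python
import bpy  # only available inside Blender

prefs = bpy.context.preferences.addons['cycles'].preferences

# 'CUDA' targets the NVIDIA card; use 'OPENCL' for the Vega iGPU instead.
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # refresh the list of detected devices

for dev in prefs.devices:
    # Tick only the GPU entries; leave the CPU unticked for pure GPU rendering.
    dev.use = (dev.type != 'CPU')

# Tell the scene to render on the enabled GPU(s) rather than the CPU.
bpy.context.scene.cycles.device = 'GPU'
```

Whichever card drives the display, Cycles renders on whatever devices are ticked here, so in principle either of the two arrangements described above should work.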
- 2 replies
-
- ryzen
- integrated graphics
-
-
Hello all, I'm getting ready to build a computer for a cousin of mine. He is not a "gamer"; he likes to work with Blender to create animations and such. YouTube videos and other sources mainly show benchmarks for games and very little for software like Blender. I'd like to know of some place that has Blender benchmarks, such as render times on multi-core CPUs and on graphics cards using either CUDA or OpenCL. I'm further confused now that AMD has released new processors and new graphics cards. Any help you can give will be greatly appreciated. Thank you in advance, and have a blessed day.
-
Hey all, just curious about rendering with Blender on Tesla video cards. What would the performance be comparable to, and has anyone tried it? The reason I'm looking at them is mainly the high VRAM amounts and the lower price compared to Quadro cards. Thanks in advance for your input!
-
Hi! We're shopping for a laptop that can handle Blender, some gaming, music, and other media at the same time. My budget is $800, and I'd prefer the following features: a numpad, a 1 TB HDD (an additional SSD is a plus), multiple USB ports, and a non-touch screen. I'd also like an optical drive but have been unable to find one at this price point. Any suggestions would be sincerely appreciated!
-
Hey all, I use a first-gen Ryzen 7 1700 8-core CPU with 3x GTX 1070 for Blender rendering. I'd like to know if there's a solution using old server parts that would give me fairly high ~3.5 GHz clock speeds but with MANY cores. Ideally I'd love a 32-core Threadripper, but it costs too much; cheap is the idea here! If anyone has recommendations for what to look for, specific Xeon processor models, or certain motherboards, I'd greatly appreciate it. If such a system would cost more than a Threadripper build, let me know too.

My current idea is to eventually swap to a 1950X, the cheapest option for 16 cores (around $800), or otherwise sell the Ryzen and motherboard for roughly $250, which added to the $800 gives me about $1050 to play with. Are there any quad-processor Xeon boards cheap enough to reach 32 cores that way?

Clock speed is a big deal, sort of: Blender is great at multi-threading, but the calculations for hair and textures (scene processing for each frame), which for my current scene take about 10 seconds before the render tiles start, usually run on only one or two cores, so slower-clocked processors take much longer on that part. Would be interested to see what's available; can't wait for the replies :) Cheers, fellow enthusiasts.
- 18 replies
-
- blender
- server hardware
-
-
I'm trying to get the two cards to work at once. I'm using two power supplies, with a jumper cable in the second one, but I can only get one card working at a time. The Quadro driver is the one installed, and that causes the 1060 to stop working. I just need the two cards for rendering animations in Blender, no gaming. Specs: 24 GB RAM, Xeon W3565 CPU, Windows 10 Pro (glitched key), Dell A13 BIOS. Everything works before I install the drivers, but I can't edit any settings. Ask any questions!
- 5 replies
-
- gtx 1060
- quadro 4000
-
-
I just want to know the best suitable combo for 3D modelling and rendering. The budget is around $2700. Can someone recommend a CPU and GPU for a PC? Thanks; a link is appreciated!
- 5 replies
-
- 3d modelling
- sketchup
-
-
Hi, the planned PC will run Linux, and its main purpose is creating/rendering animations and stills with Blender. This could put the PC under 100% load for more than two hours at a time. The other purpose is normal desktop work and playing videos. My gaming needs got stuck shortly after Zork/NetHack and Tetris... so no ambitions there. The parts I chose initially are:
CPU: Ryzen 5 3600
Mainboard: MSI B450M Tomahawk MAX
Graphics card: MSI RTX 2060 Super Ventus GP (GDDR6, PCIe)
RAM: 32 GB Corsair Vengeance LPX DDR4-2666 DIMM CL16 dual kit
Case: Sharkoon S25-V
Power supply: Corsair RM650, 650 W, 80 Plus Gold, fully modular
(Sorry for the all-bold characters; it's a cut-and-paste from a seller's parts listing.) I'm totally unsure about the CPU cooler: what should I buy? And what do you think about the parts list? Another thing: when Blender is rendering, the graphics card is 100% busy. I fear the PC would then be "blocked" for several hours and no reasonable work would be possible in parallel. So I thought of putting a second graphics card from my old PC into the new build, an Nvidia GT 1030. Will that work, and will both cards get enough cooling? I've been out of the PC-building business for about 12 years now, so any help is very much appreciated! Cheers, tuxic
-
I can get them for the same price: 400 EUR. I also do some light gaming, but only at low resolution (2560x1080), and I know both cards can handle that. I mainly work in Blender and Substance Painter. I don't really do V-Ray rendering, but I might have to use Iray in Substance Painter, and I know the Vega doesn't support it. Of course, I could just use Blender Cycles/Eevee, LuxRender, or Marmoset Toolbag instead. Oh, and I'm planning on putting an NZXT G12 on the RTX 2060 if I choose it, but I'm not sure it's on the support list. I don't really work in Substance Designer, but I heard it just got support for RTX cards, and Allegorithmic may add the same to Substance Painter in the future. So, should I give up 2 GB of VRAM for Iray? System specs: FSP Aurum 1000 PSU, MSI B450M Mortar Titanium, 16 GB 3000 MHz RAM, Ryzen 5 2600, GTX Titan Black. Thank you in advance!
-
I need some advice. I'm planning to build a PC later this year and I'm not sure which CPU or GPU to go with. I need it to be workstation-capable: Blender, Unreal Engine 4, Substance Painter, the Adobe suite, etc. However, I also want it to be a competent gaming PC for downtime and for occasional subbing in TF2 esports. CPU: I've been looking at the Ryzen Threadripper 2950X (or its 3000-series equivalent when released), or the Intel Xeon line. GPU: I've been looking at the 2080 Ti (expensive!) or just waiting for the next release (Nvidia preferably, as CUDA cores are very important in the applications I use). RAM: at least 32 GB of 3200 CL16, and if I can justify it... 64 GB. Monitor resolution: I'd like mostly color-accurate 4K IPS monitors, but that might be an issue when gaming, so I might run a multi-resolution setup: one 4K color-accurate and one 2K high-refresh? What I do know is that an NVMe M.2 drive is essential, at least for boot and software, along with a high-capacity performance hard disk such as a Barracuda or WD Black, a good-efficiency PSU (Gold or higher), and excellent cooling, both a CPU cooler and a high-airflow case. Any opinions on the topic are very much appreciated, and thank you for your time.
-
- workstation
- blender
-
-
Hi, I'd like to know if Blender and Lumion use the same rendering algorithm (I don't know how to describe it, so I'll just call it an algorithm). Since many of the published tests are done with 3DMark/Blender, I thought maybe I could use Blender test results as a point of comparison for Lumion. Thanks guys, have a nice day.
-
I have a Dell XPS 8930, and I mainly use Blender's 3D program along with the Adobe suite, but I'd like to focus on upgrading my PC to see better performance in Blender. I've always just bought a new computer every few years, but I'd rather upgrade this one a little. I have zero experience, but my father could help me install components as long as I can figure out which ones I want. I mainly render on my CPU, so I think more cores would benefit me most, but with more cores do I need more RAM? And would rendering on my graphics card be more beneficial than upgrading? Thanks in advance. Here is a list of my specs: CPU: Intel Core i7-8700K @ 3.70 GHz; RAM: 16 GB; System: 64-bit; GPU: Nvidia GeForce GTX 1070 Ti.

RAM: I have two 8 GB DDR4 sticks, and I'm thinking about adding two more (Crucial 16GB Kit (2 x 8GB) DDR4-2666 UDIMM); they look like they'd be compatible alongside my others. Or would it be better to get four of the same Crucial 8 GB sticks, or to use only two 16 GB sticks and take out my 8 GB ones?

CPU: I have six cores/12 threads, and I've noticed the number of threads really dictates render speeds, so I'd like to double the number of cores or threads. I don't really understand how this works at all. From what I understand, I can't just add another CPU? I'd need a bigger CPU with more cores that's compatible with my motherboard, and I'd possibly have to upgrade my power supply? I'd just like to get a better understanding of my options.

GPU: I don't know if I'd need to upgrade. Can I add another?
- 3 replies
-
- blender
- dell xps8930
-
Hey there, I don't know how many of you are familiar with Blender 2.8, but I'm having an issue with the Boolean > Difference modifier. I'm brand new to Blender and don't know much, but I do know that the aforementioned modifier is supposed to use one object to cut a hole in another. Unfortunately, it works the first time but not the second; here's a picture to illustrate my point. As you can see, I've cut a circular hole in both the inner and outer cylinders, but now I'm trying to cut a hole through them using the cube, and that's where I run into issues. I think it has something to do with the cylinder that used to be there overlapping with the cube. I've tried joining them into a single object and using that to cut the hole, but no dice. I can't figure this out; does anyone know how to accomplish it?
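For what it's worth, the same setup can be reproduced from Blender's Python console, which sometimes helps narrow down whether the problem is in the modifier stack or in the geometry itself. A minimal sketch against the 2.80 bpy API; the object names 'Cylinder' and 'Cube' are placeholders for whatever your scene calls them, and this only runs inside Blender:

```python
import bpy  # only available inside Blender

# Placeholder names -- substitute the objects from your own scene.
target = bpy.data.objects['Cylinder']
cutter = bpy.data.objects['Cube']

# Add a Boolean modifier in Difference mode: the cutter's volume
# gets subtracted from the target's mesh.
mod = target.modifiers.new(name='Cut', type='BOOLEAN')
mod.operation = 'DIFFERENCE'
mod.object = cutter

# Applying the modifier makes the cut permanent in the target's geometry.
bpy.context.view_layer.objects.active = target
bpy.ops.object.modifier_apply(modifier=mod.name)
```

If the scripted version fails in the same way, the usual culprit in 2.8 is non-manifold or coplanar/overlapping faces on the cutter, which the Boolean solver often struggles with.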
- 1 reply
-
- blender 2.8
- blender
-
-
Here's a little test thing I made for LMG's new channel, ShortCircuit! The idea was to have something that transitions seamlessly into the video. Let me know what you think!
-
Hello guys, I've noticed weird performance while using Blender 2.80 on my PC. Basically, when I move the camera or do something with the object, the first few frames are really choppy, and it gets pretty annoying since moving the camera is probably the thing you do most when modeling. I tested ZBrush and get the same problem, though there it's more noticeable when hovering the brush over the model. The model itself is not heavy; in fact, it still happens with an empty viewport or a simple cube. Moving panels in Blender has the same effect.

I just updated the graphics card driver with no success, although it took a long time to detect compatibility (something close to 5 minutes), which I found pretty weird. I noticed that when moving the camera the GPU usage spikes from 0 to 50%, so maybe there's a delay or something causing this? I also changed the power cable before noticing the issue, but I'm not sure that could be the problem.

My GPU is an EVGA GTX 1080 Ti SC Black without overclock; I also have a Ryzen 7 1700 and 16 GB of RAM at 3000 MHz. I'm using a 4K monitor, but it still happens on my 1080p secondary monitor. I did some tests on my PC at work, which runs a 980 Ti, and have no problems there: the GPU spikes when moving the camera hardly go past 10%. I haven't noticed this issue while gaming. Does anyone know a solution? Thanks!