RAM upgrade for new laptop for local AI
7535HS, RTX 4050, 16GB RAM

Would going to 32GB of RAM make any difference if the 4050 is doing the work?
Showing 1-15 of 23 comments
A&A 15 Jun @ 8:09am 
What LLM model are you running? The size matters. :D
Illustrious XL for pictures.

Open to recommendations for LLMs too, but so far Illustrious is doing really well on the 2500U making pictures I like. 2 hours per pic, though.
Last edited by HypersleepyNaputunia; 15 Jun @ 8:12am
A&A 15 Jun @ 8:24am 
I can't find recommended specs, but most likely yes; you'll have to check to be sure. When you run the model, see how much RAM it uses. The VRAM will certainly be maxed.
Last edited by A&A; 15 Jun @ 8:25am
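A quick way to actually check whether the 4050's 6GB is maxed while the model runs (this assumes the NVIDIA driver's `nvidia-smi` tool is on your PATH; the sample string below is illustrative, not real output from the OP's machine):

```python
import subprocess

def parse_vram(csv_text):
    """Parse nvidia-smi CSV output into (used_mib, total_mib) per GPU."""
    gpus = []
    for line in csv_text.strip().splitlines():
        used, total = (int(x.strip()) for x in line.split(","))
        gpus.append((used, total))
    return gpus

def query_vram():
    """Ask the driver for current VRAM usage (requires nvidia-smi)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_vram(out)

# Illustrative sample: a 6GB card nearly full.
sample = "5980, 6144\n"
print(parse_vram(sample))  # [(5980, 6144)]
```

If `memory.used` sits at the total the whole time while Python's RAM use keeps climbing, the model has spilled out of VRAM into system memory.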
_I_ 15 Jun @ 9:34am 
no
the 4050 has its own vram, it does not use system ram

faster system ram would help the igpu, and some things
but not games

the igpu is only used to display what the mobile nvidia gpu writes to its frame buffer
faster ram will not help with fps unless the cpu's ram is limiting it
That is incorrect. I'll explain how it works: your available VRAM is dedicated VRAM plus half of your RAM. Though if you have an integrated GPU, that takes RAM first, before your dedicated GPU. Just check NVIDIA System Information in the NVIDIA Control Panel, which will confirm what I just said; that's where I got it.
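For what it's worth, the number the NVIDIA Control Panel shows there is Windows' "total available graphics memory" pool: dedicated VRAM plus up to half of system RAM as "shared GPU memory". A sketch of that arithmetic (the 50% share is the Windows reporting convention; spilling into the shared half is far slower than staying in dedicated VRAM):

```python
def reported_gpu_memory_gb(dedicated_vram_gb, system_ram_gb):
    # Windows counts dedicated VRAM plus up to 50% of system RAM
    # as the GPU's total available graphics memory.
    return dedicated_vram_gb + system_ram_gb / 2

print(reported_gpu_memory_gb(6, 16))  # 14.0 with the current 16GB
print(reported_gpu_memory_gb(6, 32))  # 22.0 after a 32GB upgrade
```

So the upgrade raises the reported pool, but anything past the dedicated 6GB still runs at system-RAM speed over PCIe.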
A&A 15 Jun @ 10:01am 
Originally posted by _I_:
no
the 4050 has its own vram, it does not use system ram

faster system ram would help the igpu, and some things
but not games

the igpu is only used to display what the mobile nvidia gpu writes to its frame buffer
faster ram will not help with fps unless the cpu's ram is limiting it
For games it's understandable, but self-hosted text-to-image models and LLMs can allocate a lot of memory, and 6GB is a huge limitation, so system RAM is an important factor; otherwise, having to rely on the swap/page file will reduce performance further.
Last edited by A&A; 15 Jun @ 10:02am
_I_ 15 Jun @ 10:24am 
if the system/os needs more ram, more ram helps
but faster ram will not help the dedicated gpu at all, it has its own vram
Last edited by _I_; 15 Jun @ 10:24am
A&A 15 Jun @ 10:28am 
Faster RAM was never the question :/
It has minimal impact, but size does matter for AI.
Last edited by A&A; 15 Jun @ 10:29am
So for a 16GB model I would use 6GB on the NVIDIA GPU and 10GB of normal RAM?
I checked my RAM use right now on the 2500U:
10.8 GB python
306 MB InvokeAI
Last edited by HypersleepyNaputunia; 15 Jun @ 11:29am
Will it only be a difference of like 10 seconds if I do a $70 upgrade to 32GB?
A&A 15 Jun @ 12:09pm 
Maybe there won't be a difference.
How much memory is committed?
11gb
50%cpu usage
A&A 15 Jun @ 1:11pm 
then it won't help.
Originally posted by A&A:
then it won't help.

It could still help in theory (but it's not likely); it depends on their configuration and which version of Illustrious XL they intend to run. Regardless, Illustrious XL recommends at least 8GB of VRAM to run the model on the GPU. Like you said previously, the 6GB of VRAM is going to be a huge limitation.

Illustrious XL is based on Stable Diffusion XL (SDXL), which can use up to 24GB of VRAM depending on what options you pass it. In general it will easily use 8GB of VRAM even with the lowvram option (which IMO significantly reduces output quality).

Honestly, the OP is probably still going to be waiting hours for images, as they're very likely still relying on CPU processing with a 6GB VRAM GPU, and adding another 16GB of system memory isn't going to help much there.
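A rough back-of-envelope on why 6GB is tight for SDXL-class models (the ~2.6B parameter figure for the SDXL UNet is approximate, and this counts the weights only; activations, the VAE, and the text encoders add more on top):

```python
def weights_gb(params_billions, bytes_per_param):
    # Memory for the model weights alone; runtime activations,
    # the VAE, and text encoders are extra.
    return params_billions * bytes_per_param

print(weights_gb(2.6, 2))  # fp16: ~5.2 GB, already near the 4050's 6 GB
print(weights_gb(2.6, 4))  # fp32: ~10.4 GB, forced to spill to system RAM
```

Which is why half-precision plus CPU offload is about the only way a 6GB card runs SDXL at all, and why it stays slow.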
Dedicated VRAM is faster than shared system RAM, but having more system RAM still helps. Keep in mind that your GPU is limited to half of your system RAM: if your system RAM is 32GB, the GPU can use 16GB plus whatever is dedicated.

Date Posted: 15 Jun @ 8:04am
Posts: 23