Why is my dedicated GPU using only half of the available VRAM? I know it has 4 GB of VRAM, but it reports 99% usage while showing only 2 GB of VRAM in use:
![GPU memory usage screenshot](https://i.stack.imgur.com/92Squ.jpg)
![Force Full Usage of Dedicated VRAM instead of Shared Memory (RAM) · Issue #45 · microsoft/tensorflow-directml · GitHub](https://user-images.githubusercontent.com/15016720/93714923-7f87e780-fb2b-11ea-86ff-2f8c017c4b27.png)
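One way to cross-check what the OS reports is to ask the driver directly. On NVIDIA hardware, `nvidia-smi` can print total and used memory per GPU, which you can compare against the 4 GB / 2 GB numbers above. A minimal sketch (assumes an NVIDIA GPU and driver are installed; the sample string below is illustrative, not real output from the card in question):

```python
import subprocess

def gpu_memory_report(raw=None):
    """Return a list of (total_mib, used_mib) tuples, one per GPU.

    If `raw` is None, query the NVIDIA driver via nvidia-smi;
    otherwise parse the given CSV string (useful for testing).
    """
    if raw is None:
        # Requires an NVIDIA GPU and driver; memory values are in MiB.
        raw = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.total,memory.used",
             "--format=csv,noheader,nounits"],
            text=True)
    gpus = []
    for line in raw.strip().splitlines():
        total, used = (int(x) for x in line.split(","))
        gpus.append((total, used))
    return gpus

# Illustrative CSV for a 4 GB card with roughly 2 GB in use:
sample = "4096, 2048\n"
for total, used in gpu_memory_report(sample):
    print(f"{used}/{total} MiB ({100 * used / total:.0f}%)")
```

If the driver's `memory.used` matches the 2 GB figure, the remaining VRAM is simply unallocated rather than hidden; if it disagrees with the monitoring tool, the discrepancy is in how that tool counts dedicated versus shared GPU memory.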