Does the number of samples affect GPU memory?

I am trying to train a CNN for video frame prediction. My samples are large (10 × 480 × 1440 × 3). I want to know whether the total number of samples I use for training affects GPU memory use, or whether only the batch size (along with the network parameters) needs to fit into GPU memory.
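For scale, one sample is 10 × 480 × 1440 × 3 ≈ 20.7 million values, which is about 83 MB in float32 (assuming that is the dtype), so 200 samples alone come to roughly 16.6 GB before counting the model's parameters, activations, or gradients.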

The problem is that when I load 100 samples for training with batch_size = 1, I can train the model; however, when I increase the number of samples to 200, I run out of GPU memory.
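To make the question concrete, here is a minimal sketch of the two loading patterns I am asking about, assuming PyTorch (the tensor, loader, and variable names are placeholders, not my actual code):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder standing in for my real data; allocating this takes
# ~16.6 GB of system RAM for 200 float32 samples.
num_samples = 200
frames = torch.randn(num_samples, 10, 480, 1440, 3)  # lives in system RAM

# Pattern A: move the whole dataset to the GPU up front.
# GPU memory then grows with num_samples, regardless of batch_size.
# frames_gpu = frames.cuda()  # ~16.6 GB on the GPU for 200 samples

# Pattern B: keep the dataset in system RAM and move one batch at a time.
# GPU memory then depends only on batch_size (plus parameters/activations).
loader = DataLoader(TensorDataset(frames), batch_size=1, shuffle=True)
for (batch,) in loader:
    batch = batch.cuda()  # only this single batch lives on the GPU
    # ... forward / backward / optimizer step ...
```

With pattern B, my understanding is that the total number of samples should not matter for GPU memory, only for disk/system RAM, but that does not match what I am seeing.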

My machine configuration:
- GPU: NVIDIA A100, 40 GB memory
- System memory: 1008 GB

I would appreciate any suggestions on how to solve this issue.