Amazon Elastic Inference has introduced new Elastic Inference Accelerators, called EIA2, with up to 8 GB of GPU memory. Customers can now use Amazon Elastic Inference with larger models, or models that have larger input sizes, for image processing, object detection, image classification, automated speech processing, natural language processing, and other deep learning use cases.
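Elastic Inference accelerators are attached to an EC2 instance at launch time. A minimal sketch of what that request might look like, assuming boto3's EC2 `run_instances` API and the `eia2.xlarge` (8 GB) size name; the AMI ID and instance type below are placeholders, and the actual AWS call is left commented out since it requires credentials:

```python
def build_launch_params(accelerator_type="eia2.xlarge"):
    """Build RunInstances parameters that attach an Elastic Inference
    accelerator at launch; returned as a dict so it can be inspected."""
    return {
        "ImageId": "ami-EXAMPLE",        # placeholder AMI ID
        "InstanceType": "c5.large",      # client instance the accelerator attaches to
        "MinCount": 1,
        "MaxCount": 1,
        # Request one EIA2 accelerator alongside the instance
        "ElasticInferenceAccelerators": [
            {"Type": accelerator_type, "Count": 1}
        ],
    }

params = build_launch_params()
# import boto3
# ec2 = boto3.client("ec2")
# ec2.run_instances(**params)  # requires AWS credentials and a real AMI
print(params["ElasticInferenceAccelerators"][0]["Type"])  # → eia2.xlarge
```

Smaller sizes in the family (e.g. `eia2.medium`, `eia2.large`) can be substituted where less accelerator memory is needed.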

From AWS Recent Announcements: https://aws.amazon.com/about-aws/whats-new/2019/10/amazon-elastic-inference-introduces-new-accelerators-with-higher-gpu-memory/