Working with Large Assets

Kinjal P Darji
1 min read · May 5, 2022


  • Use a wired network connection when uploading heavy assets.
  • To improve asset upload time, place the temp folder on a RAM disk or SSD.
  • When there is a large volume of assets (more than 500 GB), store the binaries in an Amazon S3 data store or a file data store.
  • Disable the sub-asset generation workflow if sub-assets are not needed.
  • Instead of processing assets up front and creating renditions, prefer a solution that modifies the asset at runtime, such as Dynamic Media or ImageMagick.
  • Smart tagging can be used to organize the assets.
  • Tune the Granite workflow queues to limit concurrent jobs.
  • Configure the buffered image cache to avoid OutOfMemoryError.
  • Make the DAM Update Asset workflow transient.
  • Offload workflow processing to a dedicated instance.
  • Fine-tune Oak query and JVM parameters such as the in-memory query limit, query read limit, update limit, and fast query size.
  • Use HTTPS.
  • Configure the DAM Update Asset workflow to include only the steps that are required.
  • Configure Lucene indexes for metadata if required.
  • By default, AEM can upload an asset of up to 2 GB; for larger assets, additional configuration in CRXDE is required, including increasing the token expiration time and the receive timeout.
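For the S3 data store mentioned above, the connector is configured through a file in the install folder. A minimal sketch using Jackrabbit Oak's S3DataStore property names; the placeholder values and the cache size are illustrative only:

```
# crx-quickstart/install/org.apache.jackrabbit.oak.plugins.blob.datastore.S3DataStore.config
accessKey="<aws-access-key>"
secretKey="<aws-secret-key>"
s3Bucket="<bucket-name>"
s3Region="<aws-region>"
# Local cache for frequently accessed binaries, in bytes (example value)
cacheSize="648000000"
```

The credentials can also be supplied via an IAM role instead of keys, which avoids storing secrets on disk.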
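Tuning the Granite workflow queues is done through the Sling job queue factory configuration. A sketch, assuming the standard Granite workflow queue name and topic; the `0.5` value (half the available cores) is an example, not a recommendation for every instance:

```
# crx-quickstart/install/org.apache.sling.event.jobs.QueueConfiguration-workflow.cfg.json
{
  "queue.name": "Granite Workflow Queue",
  "queue.type": "UNORDERED",
  "queue.topics": ["com/adobe/granite/workflow/job*"],
  "queue.maxparallel": 0.5
}
```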
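The buffered image cache is likewise an OSGi configuration. A sketch assuming the `CQBufferedImageCache` PID from Adobe's performance-tuning guidance; the 1 GB limit is only an example and should be sized against the instance heap:

```
# crx-quickstart/install/com.day.cq.dam.core.impl.cache.CQBufferedImageCache.cfg.json
{
  "cq.dam.image.cache.max.memory": 1073741824
}
```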
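The Oak query and update limits above are typically passed as JVM startup flags. A sketch with illustrative values; tune them against your own content volume and query profile:

```shell
# oak.queryLimitInMemory - max nodes a query may hold in memory
# oak.queryLimitReads    - max nodes a single query may read
# update.limit           - pending changes before an in-memory branch is persisted
# oak.fastQuerySize      - return fast (estimated) query result sizes
java -Xmx8g \
  -Doak.queryLimitInMemory=500000 \
  -Doak.queryLimitReads=100000 \
  -Dupdate.limit=250000 \
  -Doak.fastQuerySize=true \
  -jar aem-quickstart.jar
```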

Kinjal P Darji

Hi, I am an AEM architect and a certified AWS Developer — Associate.