S3FS FUSE cache size limited to 64 GB

I have already set up S3FS FUSE with an S3 bucket, but while going through the S3FS FAQs I found that the S3FS cache is limited to 64 GB. I'm curious why this limitation exists and what needs to be done if I need a cache size larger than 64 GB.
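
For reference, this is roughly how I am mounting the bucket with a local cache today. It is only a sketch; the bucket name, mount point, and cache directory below are placeholders, and the command is wrapped in Python purely for illustration:

```python
# Sketch of the current mount; bucket name, mount point, and cache directory
# are placeholders. use_cache and ensure_diskfree are s3fs mount options.
import subprocess

subprocess.run(
    [
        "s3fs", "my-bucket", "/mnt/s3",
        "-o", "use_cache=/var/cache/s3fs",  # local disk cache for downloaded objects
        "-o", "ensure_diskfree=10240",      # keep ~10 GB free on the cache volume (value in MB)
    ],
    check=True,
)
```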

Hello

Thanks for contacting Wowza Support!

Can you tell us where you saw the cache size limitation?

Here is the Wowza Article on this.

Regards,

Jermaine

Hello Jermaine,

Good morning! I saw this mentioned in many blogs; I'm pasting one of the links here. Please let me know whether this is correct.

https://github.com/s3fs-fuse/s3fs-fuse/wiki/Fuse-Over-Amazon

Regards

Ray

Hello Ray,

That s3fs limit applies to a single file, not to the total cache size.

Regards,

Alex

Hello Alex,

Thanks for the clarification! In the case of a single file larger than 64 GB, is it possible to upload it by using the multipart upload option supported by S3FS?
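
For instance, would remounting with a larger part size be the right direction? A rough sketch of what I mean, assuming the multipart_size mount option and the same placeholder bucket and paths as above:

```python
# Sketch only: remount with a larger multipart part size (value in MB).
# Bucket name, mount point, and cache directory are placeholders.
import subprocess

subprocess.run(
    [
        "s3fs", "my-bucket", "/mnt/s3",
        "-o", "use_cache=/var/cache/s3fs",
        "-o", "multipart_size=128",  # upload in 128 MB parts instead of the default
    ],
    check=True,
)
```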

Regards

Ray

Having had many problems with S3FS in conjunction with recorded video, I have to ask what you are trying to accomplish. If you are making large files available to other servers, your better bet is Elastic File System (EFS), which is an actual file system. The problem with S3FS and S3 in this context is that S3 is not a file system; while S3FS overlays a file system on the objects, the underlying objects are still objects. Additionally, while S3FS has a good community support network, it is not officially supported by Amazon, so best-effort support may not be enough when you run into problems.

I use S3FS for sharing configuration files and the like, but when slogging large files around, I prefer to use the APIs that AWS officially supports.
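
For the large-file case, what I mean is something along these lines; the bucket, key, and local path are placeholders, and boto3's managed transfer handles the multipart splitting for you:

```python
# Sketch of a managed multipart upload with the AWS SDK for Python (boto3).
# Bucket name, key, and local file path are placeholders.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Files larger than multipart_threshold are uploaded as parallel multipart chunks.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MiB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MiB per part
    max_concurrency=8,
)

s3.upload_file(
    "/data/recording.mp4",          # local file
    "my-bucket",                    # destination bucket
    "recordings/recording.mp4",     # destination key
    Config=config,
)
```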

Just my experience and $0.02.

Cheers,

Bob