[Lilab] [URGENT] Please Clean HPCC Storage

Wei Vivian Li weivivian.li1 at ucr.edu
Wed Dec 17 11:53:56 PST 2025


We have 10T in the bigdata directory, of which ~3.5T is used for shared
data, leaving ~6.5T for the files of all individual accounts. Given the
current size of our group, it would be safe to limit the files in each
individual account to ~1T.
Of course, this is also project-dependent, and some projects just require
more storage space. If you are unsure about which files can be removed,
please talk to me.
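In case it helps, here is a small shell sketch for checking a directory's total size against the ~1T guideline. The directory path is a placeholder (it defaults to the current directory), and the 1T figure is the group guideline above, not a hard quota — adjust both for your own account.

```shell
# Compare a directory's total size to the ~1T per-account guideline.
# NOTE: DIR is a placeholder -- point it at your own bigdata directory.
DIR="${1:-.}"                       # default to the current directory
LIMIT_KB=$((1024 * 1024 * 1024))    # ~1T expressed in kilobytes (1 TiB)
USED_KB=$(du -sk "$DIR" | cut -f1)  # total size of DIR in kilobytes
if [ "$USED_KB" -gt "$LIMIT_KB" ]; then
  echo "over the ~1T guideline: ${USED_KB} KB used in $DIR"
else
  echo "within the ~1T guideline: ${USED_KB} KB used in $DIR"
fi
```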

*Please address these issues today or let me know if you need more time.*
Thanks!





On Wed, Dec 17, 2025 at 10:56 AM Wei Vivian Li <weivivian.li1 at ucr.edu>
wrote:

> Dear All,
>
> It appears that our shared bigdata directory on HPCC is approaching its
> storage limit, which is making access, read, and write operations extremely
> slow for everyone.
>
> Please take the following steps immediately:
>
>    - Check your storage usage in your bigdata directory by running:
>
> du -sch .[!.]* * | sort -h
>
>    - Review large subdirectories and files to determine whether they are
>    still needed.
>
>    - Delete intermediate or temporary files that are no longer necessary.
>
>    - If you wish to keep older files, move them to your personal device
>    or another appropriate storage location.
>
>    - Do not delete original/raw data or key results, even if the project
>    has been completed.
>
> Thank you for taking care of this promptly. Please let me know if you
> encounter any difficulty with the storage management.
>
> *Once you are done, please reply to this email with your overall file
> sizes before and after cleaning.*
>
> Best,
> Vivian
>
>
