I'd like your opinion on this structure:
- A random key is generated for every file, and the file is encrypted with it using AES-256.
- Every encrypted file is signed and sent to a new random hub.
- The hub's private key is stored in the user's hub, encrypted with the user hub's public key, for later use.
- The random hub's private key is also sent to the server.
- This lets the server move a file from one hub to another and hand it to a different user, so that users don't have to download and then re-upload files when sharing.
- Each user then has control over their own copy: they can delete it independently, update it to a new version under a different encryption key, and so on.
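The per-file key plus key-wrapping steps above amount to envelope encryption. Here is a minimal sketch of that pattern, assuming the `cryptography` package; the choice of AES-GCM and RSA-OAEP is my assumption, since the post only specifies AES-256 for the file key and doesn't name the asymmetric scheme the hubs use:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Step 1: a fresh random AES-256 key per file (GCM assumed, for authenticated encryption).
file_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(file_key).encrypt(nonce, b"file contents", None)

# Hypothetical hub keypair (RSA-OAEP assumed; the post doesn't specify the scheme).
hub_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
hub_public = hub_private.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Wrap the file key for the hub: only the hub's private-key holder can unwrap it.
wrapped_key = hub_public.encrypt(file_key, oaep)

# Later, the private-key holder unwraps the key and decrypts the file.
recovered_key = hub_private.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
```

One property of this layout worth noting: re-wrapping `file_key` under a different hub's public key changes who can read a copy without touching the (possibly large) ciphertext at all, which is the mechanism that lets the server shuffle files between hubs.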
In this system, the only things the server can do are move or delete files, just like a plain storage provider. It can't read the file contents or index them.
Do you think this is good practice? Is there any other way to avoid forcing the user to re-upload every file when sharing?