The largest cloud drive I can make is 10 TB. I've looked around at the settings and I can't seem to find any way to create anything larger than that.
How do I get a drive larger than 10 TB?
Occasional Super Slow Connections
Almost every read operation finishes so quickly that it's nearly impossible to even see the connection speeds for them in the log.
Occasionally, though, maybe one read per 100 GB, I'll get an incredibly slow read operation, sometimes taking over a minute to download a 20 MB chunk (the longest I've seen was 1:50), with speeds around 200-500 KB/s.
These slow reads tend to block other operations for the program I'm using, which is pretty bad.
To try to work around this, I edited the IoManager_ReadAbort value in the advanced settings, down from 1:55 to 0:30.
However, this setting doesn't work as expected. Instead of aborting the read and retrying, if a connection exceeds this timeframe it actually disconnects (unmounts) the drive and presents the Retry and Reauthorize options in the CloudDrive UI. Retry always reconnects it right away, but this doesn't solve the errant slow-connection issue.
I believe IoManager_ReadAbort would be better suited to simply reattempting the read on a timeout, instead of assuming a full provider failure.
With that in mind, I propose that when IoManager_ReadAbort is triggered it should use the IoManager_ReadRetries variable to attempt a specified number of reconnects.
Alternatively, a new flag, IoManager_MinSustainedReadSpeed (defined in KB/s), could be implemented to specifically retry connections with very slow read speeds, which would likely detect and rectify these connections sooner than waiting for a timeout period before retrying.
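To illustrate the idea, here's a rough sketch of the behavior I'm proposing (Python pseudocode on my part; the constants map to the settings named above, and the read_chunk callable, threshold, and sampling interval are all my own assumptions, not CloudDrive internals):

import time

MIN_SUSTAINED_READ_SPEED_KBPS = 500   # proposed IoManager_MinSustainedReadSpeed (assumed value)
READ_RETRIES = 3                      # proposed reuse of IoManager_ReadRetries (assumed value)
SPEED_CHECK_INTERVAL_SECS = 5         # how often to sample the transfer rate (assumed)

def read_chunk_with_speed_floor(read_chunk, chunk_size_bytes):
    # read_chunk is a stand-in for whatever actually streams a chunk; it is
    # assumed to yield the number of bytes downloaded so far at regular intervals.
    for _ in range(READ_RETRIES):
        start = time.monotonic()
        for bytes_so_far in read_chunk(chunk_size_bytes):
            elapsed = time.monotonic() - start
            if elapsed < SPEED_CHECK_INTERVAL_SECS:
                continue  # let the connection ramp up before judging it
            speed_kbps = (bytes_so_far / 1024) / elapsed
            if speed_kbps < MIN_SUSTAINED_READ_SPEED_KBPS:
                break  # abandon this slow connection and retry, don't unmount
        else:
            return True  # chunk finished at an acceptable speed
    return False  # only now treat it as a genuine provider failure

The point is that a slow connection would get torn down and retried a few times before the drive is ever treated as disconnected.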
Uploading data to the Cloud from Home, then switching to a VPS
Hi,
I have 2 Google Drive accounts with 2x file duplication in a DrivePool.
I have a load of content I want to upload from home to the cloud.
To save uploading the large files twice, can I mount ONE of these drives at home, upload the content, then detach it, reattach it to the VPS, and have the VPS do the replication without corrupting my DrivePool?
How to choose what drive CloudDrive uses for copying/storing local data?
I'm working on archiving/backing up several TB of data that I have on a local network drive by copying it from the network drive into the CloudDrive drive letter. When this happens, one of my local PC's drives fills up with what I presume is a cache of sorts, holding the data while it's copied off to GDrive over the days-long upload process.
How can I choose which drive is used for this temporary/cache storage? I can't seem to find the option in the program, and I sorely dislike not being able to set it.
Limited to no download speeds?
I am using SBCD to store files for a Plex server. I have my drive set up using settings specified in this thread (https://www.reddit.com/r/PleX/comments/61ppfi/stablebit_clouddrive_plex_and_you_a_guide/)
I have download threads set to a maximum of 10. I rarely see prefetch being utilized in the UI, and I have very poor download speeds. 95% of the time only 1 thread is in use at a time and I'm getting ~15 Mbps down. My internet connection is 75 down / 15 up. I have opened a thread on reddit but haven't gotten much help there so far. Link to thread (https://www.reddit.com/r/PleX/comments/69iy0l/issues_with_buffering_stablebit_clouddrive_google/)
Any ideas or suggestions?
Error Log
Error report file saved to: C:\ProgramData\StableBit CloudDrive\Service\ErrorReports\ErrorReport_2017_05_08-08_31_08.3.saencryptedreport
Exception: System.Security.SecurityException: Security error.
   at CloudDriveService.Cloud.Providers.Registry.ProviderRegistryEntry.#Cre(ProviderMetadataBase #9we, Guid #2Ae)
   at CloudDriveService.Cloud.IoManager..ctor(ProviderRegistryEntry providerRegistryEntry, CloudDrive cloudDrive)
   at CloudDriveService.Cloud.CloudDrive.#uTf()
   at CloudDriveService.Cloud.CloudDrives.#Rke()
The Zone of the assembly that failed was: MyComputer

Error report file saved to: C:\ProgramData\StableBit CloudDrive\Service\ErrorReports\ErrorReport_2017_05_08-08_31_08.4.saencryptedreport
Exception: CoveTroubleshooting.Reporter+ReporterLogException: {reporter_exception}
   at CoveTroubleshooting.Reporter.ThrowLogReportN(Exception TheException, Object[] TheParams)
   at CoveUtil.ErrorReporting..(Exception )
Does anyone know how to fix these errors?
Thanks
Nightly drive dismounts
Hi, I'm kind of new to this product. I think I've got things set up decently, but I'm having some issues.
Pretty much the only thing the cloud drive is regularly used for is Plex, so I'll put that out there off the bat. Streaming off of the cloud drive has been working perfectly; I haven't had any hiccups, even remotely streaming a file I know is on the cloud drive.
Pretty much daily, though, I find I've either run into a bunch of errors displayed in the upper-left corner of the app, or been disconnected from my Google Drive. Hitting Retry when I notice it's disconnected reconnects it without issue. As far as I can tell my internet connection is fairly solid and not flaky (135/55). I've made sure my router isn't scheduled to restart overnight, etc. I don't currently have any reason to believe I'm dropping my network connection.
I did have daily Plex library scans turned on, and I've turned that off as of this morning to see if it makes a difference overnight. However, I was under the impression that CloudDrive respects Google's API limits and backs off exponentially. I'm not too worried about things taking a long time to, say, scan during a library update in Plex; I'm more concerned that some action is causing the drive to be dropped and having to be manually reattached.
I'm having trouble reading the logs because the timestamps don't seem to be in my time zone, which makes it hard to line them up with events that occur on my PC, and I'm not sure I have the logging level set high enough to help. I currently have the Google side of things set to Warning.
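(In case it helps anyone doing the same, here's a minimal sketch for lining log times up with local events, assuming the service log timestamps are UTC and formatted like 2017-05-08 08:31:08; I haven't confirmed either assumption.)

from datetime import datetime, timezone

def utc_log_time_to_local(stamp):
    # Parse an assumed-UTC timestamp and convert it to the local time zone.
    utc = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return utc.astimezone().strftime("%Y-%m-%d %H:%M:%S %Z")

print(utc_log_time_to_local("2017-05-08 08:31:08"))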
What I'm seeing when I check the service log:
I'm using a small local cache because the data retrieved from the connection is unlikely to be accessed a second time; I'm playing completely random things off of Plex, and it's unlikely I'll regularly watch the exact same thing twice. I'm assuming it will have to stream and pre-cache on the fly, and I have bandwidth to burn. I can bump the cache size up without issue; I just didn't, because I didn't think I'd see an improvement if things are unlikely to be "played" twice.
Drive size: 32 TB
Chunk size: I can't find where to check it while the drive is attached, but I seem to remember 20 MB
Local cache: 5 GB, adaptive
Settings:
Download threads: 2
Upload threads: 2, with background I/O
Upload threshold: 2.00 MB or 5 minutes
Prefetcher: enabled
Prefetch trigger: 10 MB
Prefetch forward: 50 MB
Prefetch time window: 45 seconds
Thanks!!! So far, aside from this hiccup, I'm very impressed with your product!
Cache Usage and Disconnects
Hello,
I apologize in advance if there is already a solution to my problem.
I am running a 10 TB cloud drive with a 1 TB SSD cache drive and a 500 GB cache. I was hoping to utilize the entire drive as cache, but even so the cache is only 10% full. The cloud drive contains 896 GB, so why isn't all of that in the cache until it needs to be flushed?
Given the requests being made and my lack of tuning knowledge at the moment, I frequently find the drive disconnected, at which point I need to click 'Retry'.
Here is what I am envisioning as the desired operation. With such a large cache drive, I should be able to effectively fill it up with data based on requests and have items purged as new requests come in. Currently, the cache is only filled to 49.5 GB (10%). Even when the cloud drive becomes slow and/or bandwidth-constrained by the provider, I would expect writes to the local drive to keep working, given the cache, and reads to be allowed if the data is in the cache. Is there a way to tell it not to disconnect and instead just introduce a large delay when data isn't found in the cache, and/or allow writes to the cache until the connection is restored? Can I pin specific items in the cache? That would be helpful.
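Roughly the cache behavior I'm imagining, as a toy sketch (Python on my part; CloudDrive's actual cache is its own implementation, and the pinning idea is purely hypothetical):

from collections import OrderedDict

class PinnableLRUCache:
    # Toy cache: least-recently-used items are evicted first, pinned items never are.

    def __init__(self, capacity_items):
        self.capacity = capacity_items
        self.items = OrderedDict()   # key -> data
        self.pinned = set()

    def get(self, key):
        if key in self.items:
            self.items.move_to_end(key)   # mark as recently used
            return self.items[key]
        return None                       # caller would fetch from the provider instead

    def put(self, key, data, pin=False):
        self.items[key] = data
        self.items.move_to_end(key)
        if pin:
            self.pinned.add(key)
        while len(self.items) > self.capacity:
            unpinned = [k for k in self.items if k not in self.pinned]
            if not unpinned:
                break                     # everything is pinned; nothing to evict
            del self.items[unpinned[0]]   # purge the least-recently-used unpinned entry

In other words: the cache stays as full as the drive allows, old unpinned data makes way for new requests, and pinned items always stay local.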
Thoughts?
Thanks,
Greg
Moving Cache Drive Cache
Hello,
I have a 2 TB cache drive; not thinking about it when first creating it, I set the drive up as MBR. I want to move to another drive I have that uses GPT. I would just detach and reattach the cloud drive, but I have 1.9 TB in the To Upload queue. I would like to move this cache file to the new drive. My cloud drive is 100 TB; I've tried to copy and paste the two cache folders, but I get an error that I need an additional 97.5 TB. Any thoughts, or do I just need to hold out, upload the 1.9 TB, and then detach?
Thanks
Slow Uploads: I/O Error
Hi everyone. I just started using CloudDrive with an unlimited Google Drive as my backend cloud storage provider. I keep generating I/O errors and was hoping someone could help me figure out why. I've tried going down on connections with little improvement. Upload speed is usually around 10 Mbps.
I keep getting thousands of errors stating: "Your drive is having trouble uploading to Google Drive in a timely manner. Error: This thread was being aborted. This can be caused by insufficient bandwidth or bandwidth throttling."
Current I/O settings:
Download threads: 1
Upload threads: 4, with background I/O
Upload throttling: 200 Mbps
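Just to put rough numbers on the headroom here (my own arithmetic, assuming chunks of around 20 MB, which I haven't verified for my drive):

upload_mbps = 10          # measured upstream from above
upload_threads = 4
chunk_mb = 20             # assumed chunk size

per_thread_mbps = upload_mbps / upload_threads          # ~2.5 Mbps per thread
seconds_per_chunk = (chunk_mb * 8) / per_thread_mbps    # ~64 s per chunk
print(f"{per_thread_mbps:.1f} Mbps per thread, ~{seconds_per_chunk:.0f} s per {chunk_mb} MB chunk")

If those assumptions hold, each upload thread is crawling, which might explain why uploads hit timeouts and get aborted.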
Here is an example of my log file:
Feature Request: OpenDrive
Hello,
I was wondering if the CloudDrive team could look into adding OpenDrive support. It's a service that offers unlimited storage for only ~$10/month. I do not know the limitations of their API, but I'd like to know if it would support 100 MB chunks.
Link: https://www.opendrive.com/api
Is this possible?
Alternatively, does anyone know of any cheap cloud storage that I could use with CloudDrive with 100 MB chunks?
Thank you!
Detached drive mounts RAW on another Computer
I have a Google Drive running perfectly on my main PC, but whenever I mount it on my VM it asks me to format it.
I tried resetting the settings and databases, and that didn't help.
What else can I do in this case?
The drive mentioned above was first created on the same VM that now displays it as RAW.
CloudDrive + Drivepool = VPS crash
Hi,
My VPS keeps crashing overnight for some reason, and I have reason to believe CloudDrive / DrivePool is causing it. I submitted my logs for both CloudDrive and DrivePool earlier via my email danjames92@hotmail.com.
CloudDrive 1.0.0.870 / Drivepool 2.2.0.651 BETA
No longer able to create new ACD drives in build 870
After my PC took a dive the other day, I rebuilt it, reinstalled, and decided to install the latest version of CloudDrive. After installing, I tried to create a new drive, since all of my old drives were created on much older builds. It looks as though creating new drives on ACD has been disabled in this build. Is there any way around this? I was also using my own developer key in the previous build. Has that functionality been removed as well?
Windows Server 2012 R2 - CloudDrive UI has stopped working
Missing chunks. Is there a way to "fix" them?
Hi all,
I've been using CloudDrive for a while now, with Amazon Cloud Drive...
I know it's not supported (even less so now), but my data is not that "important", so I didn't mind losing files every now and then, or having slow downloads...
For a few days now I've had an issue, and I wanted to know if there's any chance I can resolve it.
Two chunks are missing... I can see the files in Amazon Cloud Drive, but I'm not able to download them... CloudDrive complains they are not found... and disconnects the disk.
Again, I wouldn't mind losing the file(s) contained in those chunks... My problem is that I don't know which files they are, and whether there's any way I can at least just ignore those chunks.
Ideally I'd remove the files contained in those chunks and then just ignore them... The problem right now is that I can't keep my drive up for more than a few minutes, as it just disconnects when it can't find them...
Is there a way to clean that up?
Regards,
Stephane
What is the max Mbps upload to Google Drive you have seen?
With 20 threads I am seeing 100-130 Mbps on a machine with gigabit up/down.
I was expecting slightly more than this, but I don't know if it's because there's about 400 GB that hasn't finished duplicating.
Is anyone seeing higher speeds? Shouldn't I be seeing higher?
Please help me diagnose "slow" speeds.
Please help me get to the bottom of why I'm experiencing slow speeds, relatively speaking, compared to my dedicated server's 1 Gb connection.
These are average speeds, measured while something is uploading at 130 Mbps.
With 20 threads I am seeing 100-140 Mbps on a machine with gigabit up/down. With 10 threads, I see 60 Mbps max at all times. That seems a long way off from the potential 900.
2 x CloudDrives in a DrivePool with 2x file duplication (OS + one cache on one drive, the other drive's cache on another drive, both HDD, both encrypted)
The network adapter is set to Public, yet for some reason the firewall won't let me edit the rules to allow public networks. Would this have an impact?
I noticed there are firewall rules that aren't set to allow public connections. Do I need to enable any?
Min download size: 20MB
Chunk Cache Size: 100MB
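For what it's worth, here's the per-thread math from the numbers above (my own arithmetic, on the assumption that total throughput is roughly threads times per-connection speed):

twenty_thread_mbps = 140   # best case observed with 20 threads
ten_thread_mbps = 60       # typical with 10 threads

per_conn_20 = twenty_thread_mbps / 20   # ~7 Mbps per connection
per_conn_10 = ten_thread_mbps / 10      # ~6 Mbps per connection
print(f"{per_conn_20:.1f} Mbps per connection at 20 threads, {per_conn_10:.1f} at 10")

Both land in the same 6-7 Mbps per-connection range, which makes it look like the ceiling is per-connection throughput to the provider rather than the thread count or the gigabit pipe itself.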
Moving CloudDrive folder between providers?
Hi,
Recent events have shown that Amazon Cloud Drive seems to be making some business decisions against third-party apps, and as a result may be changing their business plan for Cloud Drive as a whole. All my data is on ACD right now, and I'm looking to see if it's possible to move it to GDrive. I'd rather not have to download everything from ACD and re-upload it to GDrive through StableBit, so I'm wondering if it's possible (using another method) to move the whole CloudDrive folder from ACD to GDrive without any faults.
Is this possible?
Thanks
Mounting a drive after you have downloaded its complete data folder from the cloud...
Is it possible to download the data folder for a drive hosted on Amazon and then mount it locally?