I’ve not been able to find a thread that addresses an issue I’m experiencing. I recently upgraded from a TS119 to a TS262. All is great, except for a Visual Basic script that no longer works when run from the Windows Scheduler. The script is fairly simple: it copies files from the NAS to a USB drive on a Windows PC (Win 10) and vice versa. While I’ve mapped the NAS to a drive letter on the PC, I learned with the TS119 to use the “\\server\…” reference in the script when using Scheduler to run it. That all worked fine with the TS119, but it’s not working with the TS262. I’ve tried a number of things, referencing the IP address (\\192.168.1.nnn) and the server name (\\NAS87695B), but neither works. I’ve also tried mapping the drive within the script and supplying the credentials I’ve defined on the NAS, but again, no luck.
Any suggestions on other things I could try would be greatly appreciated.
You should be able to capture the script’s output to a log file and see the exact error message. It’s likely just a syntax change in the hostname or share path, or a user credential issue. You should also be able to run each line interactively and check the output to help identify the problem.
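If it helps, the script can write its own log with a FileSystemObject, so a failed Scheduler run still leaves a trace even when nobody is logged on to watch it. A rough sketch — the log path and share are just placeholders, not your real paths:

```vbscript
' Sketch: log each step so unattended Scheduler runs leave a trace.
' "C:\Logs\backup.log" and the share path below are placeholders.
Option Explicit
Dim fs, log, share
Set fs = CreateObject("Scripting.FileSystemObject")
Set log = fs.OpenTextFile("C:\Logs\backup.log", 8, True) ' 8 = ForAppending

share = "\\192.168.12.150\Documents"
On Error Resume Next
If fs.FolderExists(share) Then
    log.WriteLine Now & "  OK: can see " & share
Else
    log.WriteLine Now & "  FAIL: cannot see " & share & _
                  "  (Err " & Err.Number & ": " & Err.Description & ")"
End If
log.Close
```

Run that from the same scheduled task and the log should tell you whether the failure is “share not visible” or something else entirely.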
The script is rather long. However, it’s written in VBS. The statement that fails when run by the Scheduler is: “If Not fs.FolderExists(InShare) Then”. The variable InShare can be whatever I pass to it; in this case, it’s something like “\\192.168.12.150\Documents\rick\”. Again, this works fine when I run it as an online user, but fails when run via the Scheduler.
No, I don’t have experience with Qsync, but I’m willing to research and try it. Is this something I can run while I’m not logged on, and whose credentials the QNAP will accept?
Not sure how to do this. Where would I activate a log file: on the QNAP, or in the Windows Scheduler? Again, this script works fine if I run it as an online user, so I’m not sure how running it interactively would help. It’s only when it’s run by the Windows Scheduler that the QNAP denies access.
The VBS script that I’m using, and that has issues with my new QNAP, does a lot of things for me and has functionality that I want. That said, your suggestion re Qsync sounds interesting and relevant. I’ll certainly experiment with it and see if it looks like it can work in my LAN environment. Thanks.
Help me understand: what does your script do beyond synchronizing files? What you could do is let Qsync handle all the file synchronization. Then, if you want to alter file names or dates, etc., you would do that on the PC using your script. The script would just not connect to the NAS, and any changes made would be synced back to the NAS.
I may not understand what you’re asking. I’ve added user credentials on the NAS that align with the user accounts on each PC I want to give access to. Again, my VBS script works fine when I run it as an online user; in that case, the NAS sees the PC user’s credentials and grants access, and I don’t do anything separate to authenticate. If I run the script on a PC that I’ve not added as a user on the NAS, the script just issues an error that the share wasn’t found. That’s the error I’m getting when I run the script via Scheduler.

I know that Scheduler formats the user account differently than for an online user, and that it provides some configuration options for the account used to run the offline script. I’ve experimented with several of those options in Scheduler, and I’ve added additional user credentials to the NAS in an attempt to match whatever Scheduler uses. For instance, I believe Scheduler formulates the user as “PCName\username”, which of course is different from the account name used to log on to the PC. I’ve added every variation of this to the NAS credentials I can think of, but I’m still getting the “share not found” error from the NAS.
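You mentioned you’ve already tried mapping the drive inside the script; for completeness, here’s the shape I’d expect that to take. Mapping with an explicit NAS username and password takes Scheduler’s implicit credentials out of the picture entirely. The account name, password, and UNC path below are placeholders:

```vbscript
' Sketch: connect to the share with explicit NAS credentials before
' testing it. "nasuser"/"naspassword" and the paths are placeholders.
Option Explicit
Dim net, fs, InShare
Set net = CreateObject("WScript.Network")
Set fs  = CreateObject("Scripting.FileSystemObject")

InShare = "\\192.168.12.150\Documents"

' False = don't persist the mapping across logons.
On Error Resume Next
net.MapNetworkDrive "Z:", InShare, False, "nasuser", "naspassword"
If Err.Number <> 0 Then
    WScript.Echo "Map failed: " & Err.Number & " " & Err.Description
    WScript.Quit 1
End If
On Error GoTo 0

If fs.FolderExists("Z:\rick") Then
    WScript.Echo "Share is visible"
End If

net.RemoveNetworkDrive "Z:", True ' force disconnect when done
```

If the MapNetworkDrive call itself fails under Scheduler, the error number it returns should narrow down whether it’s an authentication problem or a name-resolution one.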
If there were a way to log or capture the credentials that Scheduler presents to the NAS, I could verify that I’m adding the proper user credentials on the NAS, but I obviously haven’t figured out how to do that yet.
Sorry to be long-winded, but in the chance you can point out what I’m missing … thanks
Some context may help. I support a non-profit organization that provides services to disadvantaged clients in our community. We capture all activity for our clients on fillable PDF forms that are consolidated and stored on our QNAP device. From there we run reports, consolidate our client master lists, and so on.

I use the QNAP cloud-based backup functionality to keep a backup of our NAS database, and I love it. I’ve configured it, however, as an exact copy, so if our staff inadvertently deletes or loses a file, the cloud backup won’t let me recover it. My LAN-based script is my secondary backup, a redundant precaution. It stores backup files on an encrypted USB storage device plugged into a PC I use for this purpose. I have it configured so that it effectively provides a Recycle copy of any files that have been deleted or moved. That way, if someone has mistakenly deleted or lost a file, I can usually find it without having to run a QNAP backup restore. Plus, since it’s a USB device, I can access it from any PC I plug it into, provided I remember the password to get past the encryption.

My script is file-based. It looks for matching files and compares dates: it copies a file to the backup if it’s new or if the date shows the source has changed, and if it sees a file on the backup that’s not on the NAS, it moves it to the Recycle folder.
There’s a little bit more to it, but these are the basics, and the reason I like it.
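For anyone curious, the basics above boil down to something like the following. The paths are placeholders, and my real script recurses through subfolders and handles more cases, but this is the core idea:

```vbscript
' Sketch of the compare-dates-and-copy logic described above.
' All three paths are placeholders.
Option Explicit
Dim fs, src, dst, recycle, f, target
Set fs  = CreateObject("Scripting.FileSystemObject")
src     = "\\192.168.12.150\Documents\"   ' NAS source
dst     = "E:\Backup\Documents\"          ' USB backup
recycle = "E:\Backup\Recycle\"            ' holding area for deletions

' Copy new or changed files from the NAS to the backup.
For Each f In fs.GetFolder(src).Files
    target = dst & f.Name
    If Not fs.FileExists(target) Then
        f.Copy target
    ElseIf f.DateLastModified > fs.GetFile(target).DateLastModified Then
        f.Copy target, True ' overwrite the stale backup copy
    End If
Next

' Anything on the backup that's gone from the NAS moves to Recycle.
For Each f In fs.GetFolder(dst).Files
    If Not fs.FileExists(src & f.Name) Then
        f.Move recycle & f.Name
    End If
Next
```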
I really think you could do this using Qsync. You set up “paired” folders that are constantly kept in sync with each other, and you can select what happens in the case of deletions. Since your USB drive could be removed or changed, what I would do is use Qsync to sync between the NAS and the PC. Again, this is “live”: a file changes on the NAS, and it gets sent to the PC immediately.
Then run your script on your PC to copy files from the “Qsync” folder to your USB drive.
Now you don’t have to worry about any messy usernames, etc.
You should also consider using snapshots on the NAS. They are wonderful for recovering those accidentally deleted files.
I took a quick spin through the Qsync tutorial and agree that it looks like it could serve my needs. It addresses my desire for a local store to complement my cloud-based backup, and it has options to recover files that have been deleted by mistake. And, of course, because it’s a QNAP app that resides both on the NAS and the local PC, it should resolve my credentials issue.
It looks like I wouldn’t have control over “when” it synchronizes. The only issue there is that I currently unlock my USB drive during the backup and then lock it back down afterward. Your suggestion that I use the PC drive to handle Qsync and then use my own script to copy to the USB drive could resolve this, provided the PC has adequate drive space.
Anyway, I very much appreciate the suggestion and tend to agree. I know that this is a more advanced method to accomplish my goals … though it’ll be tough to stop using a script that I built and have come to trust. But, one can’t be averse to change, like I tell my users! Thanks again.
Re: database size, I just checked, and the key folder containing the data we back up is up to 15GB. The script I’ve been referring to runs on an old desktop PC with Win 10. I do most of my support remotely, so I’ll have to check that PC to see how much unused space its hard drive actually has. As I’ve noted, the backup I maintain with this PC uses a USB drive, which is at least 1TB, so space hasn’t been an issue. Plus, I’ve encrypted the USB drive using the functionality Windows provides for this. There’s some peace of mind in that, since we’re talking about confidential client information. While the USB drive is flexible and can be accessed on another machine if need be, it’s encrypted and can’t be read without the key or password to unlock it.
When I’m next in the office, I’ll see how much space this utility PC has and if it could handle this.
Thanks for all the help and suggestions you’ve been providing.
I just read this FAQ and it sounds promising. While I’ve experimented with adding credentials to the NAS in an attempt to align with how Windows Scheduler will attempt to connect, I’ve not experimented with the credentials on the PC. I’ll give it a try when I’m next able to access the PC.
btw, I’ve been a NAS owner for many years and have used many of the apps and cloud services it supports. You guys are great! Thanks.