Part of my email workflow is to never delete any emails but to archive them instead, aiming for inbox zero. I really thought this was a much more common workflow than deleting emails or leaving them marked as read in the inbox, but that doesn't seem to be the case, at least among Nextcloud Mail users. Does anyone have a workaround where I can ideally hit a key and have an email move from my inbox to All Mail? I'm aware of this thread, but it seems abandoned:
I organize pictures and files on my laptop using DigiKam and the file browser. Files are moved from the /Inbox directories to the /Documents and /Pictures archive directories. As I do this, the files are synchronized off my phone (deleted) and copied to their new location on the Nextcloud server.
The /DCIM, /Documents, /Download, /Movies and /Pictures directories on my phone work like inboxes, so they are synchronized into subdirectories under /Inbox on my computer. I process the inboxes on my computer by organizing files under the archive directories /Documents and /Pictures. The files get deleted from the phone, but they remain accessible for viewing or downloading through Nextcloud.
This has turned out to be a bad solution because Nextcloud does not like you manipulating files directly on the machine that is running the server. It prefers that you do everything through the web GUI or on a separate machine running as a client. Since I am using my desktop as both a server and a client, whenever I change a file on the desktop or move something around with the file manager, things get wonky and fall out of sync with Nextcloud as a whole. Basically, the intended use for Nextcloud, as I understand it, is to run the server as a server and leave it alone, doing all file access and manipulation exclusively through client machines.
So then I decided to try Syncthing. My plan, as the picture above shows, is to use Syncthing to sync the laptop and the desktop, while Nextcloud runs on the desktop and makes files available through the web interface for sharing. Syncthing is having a very hard time syncing these folders, though, as projects range from a few folders with extremely large binary files (high-poly 3D scans of landscapes made with photogrammetry) to many hundreds of thousands of small folders with tiny text files (such as VS Code config folders with node module dependencies).
I also found Nextcloud/ownCloud slow to update, and other issues with mobile use made me switch to Seafile.
I also found Syncthing unacceptable for my purposes, which range from many small files to very large files. Project files were one of the main culprits for me with Syncthing, same as for you.
I think first you should define your requirements.
Do you need to run the server on your desktop/workstation? If so, why?
Do the features ownCloud lacks really impact your usage? If not, you may be better off with ownCloud/Nextcloud.
This repository contains the stable releases of the client. For the cutting-edge, alpha version, see the alpha PPA at nextcloud-devs/+archive/ubuntu/client-alpha.
There is also a beta PPA for the stable release candidates at nextcloud-devs/+archive/ubuntu/client-beta.
On the client that still has the notes, export all the notes as a JEX archive (File > Export all > JEX - Joplin Export File). Then delete all the notes/notebooks from the client and import the JEX. The imported notes/notebooks should be treated as new notes, and the client should sync them all back to Nextcloud.
PS: everything you did afterwards probably messed it up even more.
Nextcloud works out of the box when extracted from the regular archive; just make sure the owner and group are correct, nothing else. I run it that way myself, so I know for sure it works without custom templates.
I can download a zip file using an API call. Now what I would like to do is to have an unzipped directory on my Nextcloud instance and download it via an API call. It doesn't matter to me if I get a zip file back when I make the API call; I just want it to be unzipped in the cloud.
So I assume that the WebDAV API of Nextcloud doesn't allow me to do it the way I want. Now I am wondering what I am doing wrong, or whether what I want is even doable. I also don't get why this works fine in the browser: I just select the unzipped folder and can download it with a click. In the Nextcloud community it has been suggested to use Rclone ( -complete-directory-from-nextcloud-instance/77828), but I would prefer not to use a dependency that I have to set up on every machine where I want to run this code.
How are you set up? Are you running the same web server for nextcloud and erpnext? What did you try? What does your config look like? How do you know it is not working? What errors are you getting?
I have installed ERPNext manually on ports 80 & 443 with DNS multitenant setup enabled, and I access it via a domain name (forwarded to my server IP).
I installed ERPNext first successfully and then ran the setup for Nextcloud. I am trying to access Nextcloud via a CNAME entry, but it shows me the Frappe error page rather than the Nextcloud login page; maybe I am doing something wrong.
Once downloaded, extract the archive with unzip. (If the above command somehow doesn't work, download Nextcloud manually and use FileZilla or any other client to copy the nextcloud.zip file to the user folder, then run unzip.)
The -d option specifies the target directory; the Nextcloud web files will be extracted to /usr/share/nginx/nextcloud/. Then we need to change the owner of this directory to www-data so that the web server (Nginx) can write to it.
I had an old server, so I created a new server with the same setup (Ubuntu, Nginx, etc.) and copied ds.conf from the old server to the new one. I was getting a 403 error saying Nextcloud could not access it. What I noticed is that the old server's ds.conf had the line set $secure_link_secret verysecretstring; while the new server had set $secure_link_secret kkjnjkok1kk1k1nkj1; so when I copied over the old ds.conf, the kkjnjkok1kk1k1nkj1 secret was lost. I had to change set $secure_link_secret verysecretstring; back to set $secure_link_secret kkjnjkok1kk1k1nkj1; and the 403 went away. Why is this? The old server accepted set $secure_link_secret verysecretstring; before.
This is our final step. Our tunnel is configured, and the Nextcloud container was set up and updated to use our custom domain. We verified ownership of the domain and created a CNAME record for nextcloud.packetdemo.com that points to the hostname assigned to our tunnel, small-dust-63699.pktriot.net. Now we just need to create a rule so that traffic to our custom domain is relayed to our Packetriot client and then proxied upstream to our container.
Next, create the backup archive using the tar command to produce a gzip-compressed file and display verbose output on the screen. The new archive will be called owncloud.tar.gz and will contain the entire owncloud/ directory. Execute the following command:
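A sketch of that command, demonstrated here on a throwaway directory so it can run anywhere; on the real server you would run the tar line from the directory that contains owncloud/.

```shell
# Build a small stand-in owncloud/ tree in a temp directory:
workdir=$(mktemp -d)
mkdir -p "$workdir/owncloud/data"
echo "demo" > "$workdir/owncloud/data/file.txt"
cd "$workdir"

# c = create an archive, z = compress with gzip, v = verbose, f = archive name
tar -czvf owncloud.tar.gz owncloud/
```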
At the Nextcloud release site you will find a list of every Nextcloud release in a number of different formats. Find the most recent .tar.bz2 file for the release that is the same as, or one major version after, your current ownCloud version. For example, if you are migrating from the ownCloud 9 One-Click installation you would be looking for the file nextcloud-10.0.2.tar.bz2.
One consequence of moving files with the sudo command is that the files will all be owned by the root user. Nextcloud, however, always runs as the www-data user. This means you need to change the ownership of the /var/www/nextcloud folder and its contents before you go any further. To do this, run the chown command with the -R argument to recursively change all of the file ownerships to the www-data user:
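The demo below shows the recursive chown on a throwaway tree, chowning to the current user since it runs unprivileged; on the real server you would run, as root: chown -R www-data:www-data /var/www/nextcloud

```shell
# Stand-in for /var/www/nextcloud:
tree=$(mktemp -d)
mkdir -p "$tree/nextcloud/config"
touch "$tree/nextcloud/config/config.php"

# -R recurses into every file and subdirectory
# (on the server, substitute www-data:www-data for the current user):
chown -R "$(id -u):$(id -g)" "$tree/nextcloud"
```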
With all of the files in place, you can initiate the internal upgrade process. Nextcloud and ownCloud provide a tool to manage and upgrade installations called occ. Navigate to the /var/www/nextcloud/ directory:
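The occ invocation looks roughly like this; it is a sketch that only makes sense on the server itself with a live install, run as the web server user (www-data on Debian/Ubuntu).

```shell
# Change into the Nextcloud root, where the occ tool lives:
cd /var/www/nextcloud

# Run the built-in upgrade routine as the web server user:
sudo -u www-data php occ upgrade
```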
Next, replace all instances of owncloud in the configuration file with nextcloud. You can do this by opening /etc/apache2/sites-available/000-nextcloud.conf with a text editor and making the changes yourself, or by using regular expressions and the sed command.
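The sed approach can be sketched as below; the demo operates on a temp copy so it can run anywhere, while on the real system the file is /etc/apache2/sites-available/000-nextcloud.conf and editing it needs sudo.

```shell
# Stand-in vhost file with some owncloud paths in it:
conf=$(mktemp)
printf 'DocumentRoot /var/www/owncloud\nAlias /owncloud /var/www/owncloud\n' > "$conf"

# -i edits in place; s/owncloud/nextcloud/g replaces every occurrence on each line:
sed -i 's/owncloud/nextcloud/g' "$conf"
cat "$conf"
```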
If you decide to switch back to ownCloud you can restore the data/ and config/ folders from the backup you created in Step 1, as well as any external database you backed up. Do not try to copy the data/ and config/ folders from /var/www/nextcloud back to ownCloud. Once the backups have been restored, all you have to do is disable the Nextcloud vhost and enable the ownCloud one, using the same procedure in Step 4.
We recommend configuring your Nextcloud instance to increase the max chunk size to 1 GB for better upload performance. See _manual/configuration_files/big_file_upload_configuration.html#adjust-chunk-size-on-nextcloud-side
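Assuming the occ config:app:set command described in the linked documentation, the adjustment is a one-line server-side configuration change (the path is an example, and 1073741824 bytes = 1 GB):

```shell
# Run on the Nextcloud server, from the Nextcloud root, as the web server user:
cd /var/www/nextcloud
sudo -u www-data php occ config:app:set files max_chunk_size --value 1073741824
```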
If you selected a single file, it will prompt you to confirm the download. If you have chosen more than one file, Nextcloud will place all of the selected files into a zip archive. Before you can use the files, you will need to extract them from the archive. Once you have downloaded your file or extracted your archive, you are ready to use your files on your machine.
It is assumed that you have already installed Apache from the Debian repository. After you download the Nextcloud archive from the official website and extract it into the Nextcloud root directory, you will notice that there is a preconfigured .htaccess file in the root directory. That file contains specific settings needed by Nextcloud, and you should leave it as it is. However, to fully configure Apache to serve Nextcloud and its applications, you should follow the steps explained below. You can serve Nextcloud on a subdomain, like cloud.example.com, which we recommend, or in a subdirectory, like example.com/nextcloud. Both situations are described below.