Using the Python integration, you can directly view any 3-d NumPy array loaded in memory (or any other Python object that supports NumPy-style indexing, such as an h5py dataset). To do that, you can simply modify e.g. example.py.
You can also use the precomputed format (or the N5 or NIfTI formats) to view local data --- in that case you don't need the Python integration (you can even use the public http://neuroglancer-demo.appspot.com client), but you will need some other web server to host the data. You can use the cors_webserver.py script for that purpose. Note that if, as is likely, you serve the data from an http:// rather than an https:// URL, you need to make sure that the Neuroglancer client you use is also served from a localhost or http:// URL, not an https:// URL, due to mixed-content restrictions imposed by web browsers. In particular, if you are serving the data from http://localhost:8000, make sure to use http://neuroglancer-demo.appspot.com rather than https://neuroglancer-demo.appspot.com.
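The cors_webserver.py script is essentially a static file server that adds the Access-Control-Allow-Origin header, so that a Neuroglancer client served from a different origin is allowed to fetch the data chunks. A stdlib-only sketch of the same idea (not the script itself; binding to port 0 just lets the OS pick a free port for this demo):

```python
import threading
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class CORSRequestHandler(SimpleHTTPRequestHandler):
    """Static-file handler that adds the CORS header Neuroglancer needs."""
    def end_headers(self):
        # Allow a client served from another origin to fetch the data.
        self.send_header('Access-Control-Allow-Origin', '*')
        super().end_headers()

# Serve the current directory. cors_webserver.py serves on a fixed
# port such as 8000 instead of an OS-assigned one.
server = ThreadingHTTPServer(('localhost', 0), CORSRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print('serving at http://localhost:%d/' % server.server_address[1])
```

Run it from the directory containing your data, then point Neuroglancer's precomputed (or N5/NIfTI) data source at the printed URL.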
You can also download one of the public datasets for local testing, e.g.:
gsutil -m cp -r gs://neuroglancer-public-data/flyem_fib-25_training2 .
That will download the small flyem_fib-25_training2 image and ground truth segmentation volumes. You can also download the larger datasets in the gs://neuroglancer-public-data bucket, but because of the large number of files involved, this will take quite a long time with gsutil.