
How To Download File From Azure Blob Storage Using .NET Core


Vallie Kleinert

Dec 29, 2023, 12:59:03 AM
Here, using is a C# statement that keeps the code clean by disposing of the stream automatically. The stream opens the file for reading, the blob client uploads it to Azure, and we then read the URL from the blob, which gives us the string URL or path where the file was uploaded.
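For illustration, a minimal sketch of that kind of upload helper with the Azure.Storage.Blobs package (the method name and parameters here are assumptions, not the original code):

    using Azure.Storage.Blobs;

    public async Task<string> UploadFileAsync(string filePath, BlobContainerClient container)
    {
        var blobClient = container.GetBlobClient(Path.GetFileName(filePath));

        // using disposes the read stream once the upload completes
        using (var stream = File.OpenRead(filePath))
        {
            await blobClient.UploadAsync(stream, overwrite: true);
        }

        // Uri is the address where the uploaded file can now be reached
        return blobClient.Uri.ToString();
    }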









I am attempting to create a blog where you can post with images. Basically, what I would like to do is select an image from the local machine, send it to Azure blobs with a POST method on a button click, then retrieve and display the image on the web page using a GET method. I found this Microsoft documentation:

-us/azure/storage/blobs/storage-quickstart-blobs-dotnet

But I don't know how to apply it to a GET and POST method. I also tried this:




However, I do not know how to apply this to a Web API.



I have many doubts such as:

1. What should the GET and POST methods be?

2. How do I handle the image on the client side in the GET and POST requests with jQuery?

3. In what way do I send the image through the PUT and GET requests (JSON maybe)?

4. How do I get the image into an img tag?

5. How do I identify which image belongs to which post?



Would really appreciate the help. Sorry in advance if what I ask is not relevant or silly. I am still a student.



Note: I am using .NET Core 2.1


Open a new terminal in Visual Studio Code (CTRL+SHIFT+`) and execute the following commands:

    dotnet new sln
    mv workspace.sln azure-blob-storage-repository-demo.sln

Create an MVC Web Application

We need a web application to interact with our objects.
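The next commands didn't survive the copy; presumably something like the following creates the MVC project and adds it to the solution (the project name here is a guess):

    dotnet new mvc -n AzureBlobStorageRepositoryDemo
    dotnet sln add AzureBlobStorageRepositoryDemo/AzureBlobStorageRepositoryDemo.csproj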






I was looking through the posts on weblogs.asp.net a few days ago and spotted this one, Corrupted File when using DownloadToStream in Azure blob, in which the author does what I do and realises their silly mistake! The bit I saw that did make me question what was being done was (emphasis mine):


Just like it would make sense to copy the file directly from Server X to Server Y, it makes sense (surely!) to keep the file move operation (which, as in the original blog post, is essentially composed of a Copy and a Delete) entirely within Azure, or as entirely within Azure as possible. So, without further ado, here's my heavily decomposed and completely error-checking-free code for moving a file in Azure blob storage, without having the bytes "come down the wire" onto the computer running the code that initiated the copy/move (requires the WindowsAzure.Storage NuGet package):
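The original listing was lost in this archive; a condensed sketch of the same server-side move with the legacy WindowsAzure.Storage client (container and blob names are placeholders) might look like this:

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    var account = CloudStorageAccount.Parse(connectionString);
    var container = account.CreateCloudBlobClient().GetContainerReference("my-container");

    var source = container.GetBlockBlobReference("old/name.txt");
    var destination = container.GetBlockBlobReference("new/name.txt");

    // Ask Azure to copy the blob server-side; no bytes travel through this machine.
    await destination.StartCopyAsync(source);

    // Poll until the asynchronous server-side copy finishes.
    while (destination.CopyState.Status == CopyStatus.Pending)
    {
        await Task.Delay(500);
        await destination.FetchAttributesAsync();
    }

    // Complete the "move" by deleting the original.
    await source.DeleteAsync();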


Note that we are using DisableRequestSizeLimit here for the demo; you may want to remove this in production apps. As the blobContainerName parameter we pass the name of the container we want to store our data in. We created this earlier when adding the storage account in Azure, but our code will also create a new container automatically if it does not exist.
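As a rough sketch of the kind of action being described (the route, field names, and the _connectionString field are assumptions, not the original code):

    using Azure.Storage.Blobs;
    using Microsoft.AspNetCore.Mvc;

    [HttpPost("upload")]
    [DisableRequestSizeLimit] // demo only; consider a sensible limit in production
    public async Task<IActionResult> Upload(IFormFile file, string blobContainerName)
    {
        var container = new BlobContainerClient(_connectionString, blobContainerName);

        // The container is created automatically if it does not exist yet.
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlobClient(file.FileName);
        using (var stream = file.OpenReadStream())
        {
            await blob.UploadAsync(stream, overwrite: true);
        }
        return Ok(blob.Uri.ToString());
    }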


Accessing Azure blob storage in government regions using a storage integration is limited to Snowflake accounts hosted on Azure in the same government region. Accessing your blob storage from an account hosted outside of the government region using direct credentials is supported.


The Azure blob storage service allows users to create, update, and upload unstructured data in Azure. Later, users can access or download this data from anywhere in the world. This data can be audio, video, image, text, or log files, and the service can also be used for streaming video or audio. Users can create and upload blobs directly from the Azure portal, and developers can create and upload blobs via code. Blobs can also be accessed over HTTP / HTTPS.
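For instance, a publicly readable blob is reachable at a plain HTTPS URL; a small sketch (the account, container, and file names below are made up):

    using var http = new HttpClient();

    // Public blobs follow the pattern https://<account>.blob.core.windows.net/<container>/<name>
    var bytes = await http.GetByteArrayAsync(
        "https://myaccount.blob.core.windows.net/images/photo.png");
    await File.WriteAllBytesAsync("photo.png", bytes);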


I had to write some C# to download multiple blobs from Azure Blob Storage using the Azure.Storage.Blobs NuGet package. To my surprise, no bulk option exists (at least not to my knowledge). Here's a quick summary of how I somewhat achieved this.


As mentioned already, there are no methods to download multiple blobs from Azure Blob Storage using the Azure.Storage.Blobs NuGet package. There's a nice paging API, and you can get both all blobs in a container and all blobs with a specified prefix. What these methods have in common is that they only fetch metadata about the blobs. Let me illustrate with a simple example that fetches all blobs from a container:
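(The original listings didn't survive this archive; what follows is a reconstruction sketch, with the connection string and container name assumed.)

    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    var container = new BlobContainerClient(connectionString, "my-container");

    // Step 1: page through the container; each BlobItem carries metadata only.
    var items = new List<BlobItem>();
    await foreach (BlobItem item in container.GetBlobsAsync())
    {
        items.Add(item);
    }

    // Step 2: start one download per blob and await them all together.
    // Assumes flat blob names and an existing local "downloads" folder.
    var downloads = items.Select(item =>
        container.GetBlobClient(item.Name)
                 .DownloadToAsync(Path.Combine("downloads", item.Name)));
    await Task.WhenAll(downloads);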


That's it. Blobs are now downloaded in parallel. The code is only valid for scenarios where you need to download blobs from within a .NET program. In case you need to download or upload files from the file system, there's a range of different tools to help you. On the command line I prefer AzCopy, and for a Windows app I use Azure Storage Explorer.
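For example, a typical AzCopy invocation for pulling down a whole container looks roughly like this (account and container names are placeholders):

    azcopy copy "https://myaccount.blob.core.windows.net/mycontainer" "C:\downloads" --recursive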


Additionally, we need Serilog.Sinks.AzureBlobStorage in order to write to a blob storage file. Serilog uses the concept of sinks for its various providers; a sink, in essence, is a class that receives or consumes calls/incoming requests from another object.


The Serilog method requires a connection string for your blob storage to be provided as an argument. We also need to add UseSerilog() as an extension call in the CreateHostBuilder method; this sets Serilog as the logging provider.
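A minimal sketch of that wiring (the container name, file name, and connection string placeholder below are assumptions, not from the original post):

    using Microsoft.Extensions.Hosting;
    using Serilog;

    public class Program
    {
        public static void Main(string[] args)
        {
            // Writes log events to a blob via Serilog.Sinks.AzureBlobStorage.
            Log.Logger = new LoggerConfiguration()
                .WriteTo.AzureBlobStorage(
                    connectionString: "<blob storage connection string>",
                    storageContainerName: "logs",   // assumed container name
                    storageFileName: "log.txt")     // assumed blob name
                .CreateLogger();

            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .UseSerilog() // sets Serilog as the logging provider
                .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
    }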


Firstly, you need to create a blob (short for Binary Large Object) container in Azure Storage, or you can create one from code using the connection string. I set the container access policy to public so the images can be accessed publicly; Azure lets you pass different access policies. You can access Blob Storage from your application using the connection string provided by Azure Blob Storage.
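Creating the container from code with a public access policy might look like this sketch (the container name is an assumption):

    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    var container = new BlobContainerClient(connectionString, "images");

    // PublicAccessType.Blob lets anyone read blobs by URL, but not list the container.
    await container.CreateIfNotExistsAsync(PublicAccessType.Blob);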


My guess is that it is a bug in the desktop client: it caches the Azure blob storage contents and does not refresh them. Uninstalling and reinstalling the client has no effect; I still see file contents from 4 days ago.


When you need SAS token authentication for Azure Data Lake Storage Gen 2, you can use the Azure SAS Token Provider for Hadoop. To do that, upload the JAR file to your S3 bucket and configure your AWS Glue job to set the S3 location in the job parameter --extra-jars (in AWS Glue Studio, Dependent JARs path). Then save the SAS token in Secrets Manager and set the value to spark.hadoop.fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net in SparkConf using script mode at runtime. Learn more in the README.


Under "Blob service", take note of the value for "Blob service", specifically the blob endpoint suffix. This is the value after It is typically core.windows.net, but might vary depending on your Azure region or account type.


This is a Class 2 WebDAV server that keeps data in Azure Blob storage with Data Lake support. This sample publishes a hierarchical folder structure from Azure Data Lake and keeps locks, custom properties, and file creation and modification dates in Azure Blob properties. It uses Azure AD authentication and Azure Cognitive Search for full-text indexing and search.


This example groups and counts the taxi rides by month, then orders the results. The use of functions over the date is not optimized at all, but the query still executes in about one second. It illustrates more complex calculations and summaries over the data in blob storage.


This particular application required the blob storage container to be private, with the files accessible only from the front-end application. The React front end used Azure Active Directory authentication to control user access.



