Project 64 32 Bits


Karina Edling

Aug 5, 2024, 10:37:58 AM
to condtemodu
Two-stage carves, or roughing/detail carves, allow you to use two bits on the same project. This allows you to carve as much of the project as possible with a larger (roughing) bit and then swap in...

In this post, I'm sharing tips and tricks for managing and maintaining an open-source Zig project and covering commonly used practices. I'm also giving a brief introduction to my first-ever Zig project, "linuxwave", which led to the writing of this series.


While writing Zig over the past months, I sometimes had a hard time figuring out what to do or which way to go to do a certain thing efficiently. During those times, I either went for a hunt in the wild interwebs or asked a question in the Zig Discord server. Unfortunately, there is still a lack of documentation, so I turned this getting-help process into writing blog posts and sharing my findings. Here are the first two parts of this series, which came out exactly in the way I described - me trying to figure things out:


In the meantime, I was working on my project and writing Zig at every chance I got. I must say, I absolutely enjoyed this process, since it had been a while since I'd learned something low-level like Zig. Actually, it had been a while since I'd learned a new programming language at all. Development was really fun.


However, like every good thing, this had an end. After the core functionality was there and the Zig code was thoroughly tested, I started to do chores that are more related to project management than to actual development. Although it is sometimes boring, I still believe that doing grunt tasks is important, since it lets you maintain the project efficiently and paves the way for new contributors.


So I will be sharing my experience of project management in Zig to document what could be done to achieve a better stance for your Zig project in the open-source community. This could also be perceived as me trying to apply the same open-source maintenance techniques that I use for my Rust projects to a Zig project.


For newcomers, this is one of the prominent questions. Currently, there does not seem to be an easy and standard way like cargo add for adding libraries to your Zig project. However, there are some package managers available for this purpose:


One of the cool things you can do is track how much of your code your tests cover. This also helps with testing the functionality better and potentially eliminating bugs. Sometimes you even need to refactor your code to write tests for a certain function/module, which makes the code better at the end of the day.
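One way to collect coverage (a sketch, assuming kcov is installed and your tests live in src/main.zig - your layout may differ) is to have Zig run the compiled test binary under kcov via the --test-cmd flags:

```shell
# Instead of running the test binary directly, wrap it in kcov.
# --test-cmd builds up the wrapper command; --test-cmd-bin tells Zig
# to append the path of the compiled test binary to that command.
zig test src/main.zig \
    --test-cmd kcov \
    --test-cmd kcov-output \
    --test-cmd-bin

# kcov writes an HTML coverage report into ./kcov-output
```

The resulting report can then be uploaded to a service like Codecov from CI.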


The Zig compiler comes with automatic documentation generation. It can be invoked by adding -femit-docs to your zig build-exe, zig build-lib, zig build-obj, or zig run command. The documentation is saved into ./docs as a small static website.
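As a sketch (assuming a single-file executable at src/main.zig), generating and previewing the docs locally might look like:

```shell
# Build the project and emit API documentation alongside it;
# the generated static site lands in ./docs
zig build-exe src/main.zig -femit-docs

# Any static file server can preview it; Python's is just one option
python3 -m http.server --directory docs
```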


I don't find it feasible to maintain the docs/ folder in the repository so I added it to the .gitignore. What I want instead is to auto-generate the documentation and deploy it when I push commits to the main branch.


Here is a GitHub Actions workflow file that automates the process of building a binary for a specific target and publishing it on GitHub every time there is a new version tag pushed to the repository:
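A minimal sketch of such a workflow (the action names, target triple, and paths here are assumptions for illustration, not the project's actual file) could look like:

```yaml
name: Release

on:
  push:
    tags:
      - "v*"   # run only when a version tag is pushed

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Any action that installs Zig works here
      - uses: mlugg/setup-zig@v1
      - name: Build for a specific target
        run: zig build -Doptimize=ReleaseSafe -Dtarget=x86_64-linux-gnu
      - name: Publish the binary as a release asset
        uses: softprops/action-gh-release@v2
        with:
          files: zig-out/bin/*
```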


linuxwave is a command-line tool written in Zig for generating music from the entropy of the Linux kernel (/dev/urandom). It can also compose music from an arbitrary input file and encode it as a WAV file.
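For a flavor of the CLI (flag names here are from memory and may differ between versions - check linuxwave --help):

```shell
# Generate music from kernel entropy into output.wav
linuxwave -o output.wav

# Use an arbitrary file as the entropy source instead of /dev/urandom
linuxwave -i some_file.txt -o song.wav
```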


Before anyone comments about the title of this post, I must admit that we didn't actually master project management in Zig completely (yet). But that was the most fitting title that ChatGPT suggested based on my bullet points, so I decided to roll with it.


I believe I covered a couple of important techniques and best practices for efficiently managing open-source Zig projects. I might share more about this topic in the future - feel free to let me know if you have additional tips or any questions!


Every summer, we partner with Explore.org on our Beluga Cams to give viewers an immersive and inspiring view into the underwater world of beluga whales. These Arctic whales are a beloved seasonal visitor to the Churchill River estuary that flows near our interpretive center in Churchill, Manitoba. The live cams are part of our goal to inspire people to care about the Arctic ecosystem.


Belugas are highly adapted to life in and around Arctic sea ice. Sea ice forms the base of the food chain for many beluga populations. While belugas are near the top of the food web, they rely on the algae, zooplankton, and fishes that are part of the Arctic ecosystem. Also, sea ice offers the smooth-backed belugas protection from predatory orcas, whose tall dorsal fins make it difficult for them to navigate the ice-covered ocean.


Like many projects, Beluga Bits started with a lot of questions about beluga whales followed by putting the pieces together to make it work. I had been studying belugas and other Arctic marine mammals for a few years and happened to be working on another project in Churchill with Meagan Hainstock looking at beluga behavior. While we were out on the water, we would get these tantalizing glimpses of belugas who had scars and marks that we thought could be used for photo identification, a common way to study other whales non-invasively.


Our project has grown in scope every year since 2016. On Beluga Bits, we now have over 25,000 registered participants who have completed nearly 5 million classifications! There have been upgrades and changes to almost every aspect of the project. Our longstanding partnership with Polar Bears International has ensured incredible video footage every year, and they have also been able to expand video collection to other parts of Western Hudson Bay in addition to the in-estuary footage. An exciting development we have incorporated recently is the use of artificial intelligence, specifically machine learning, to help remove photos without belugas before uploading. Too many photos of just water are not that fun for participants to sort through, and AI can quickly do this boring task. We partnered with a computer science professor and a student at the University of Manitoba to create an algorithm that can automatically distinguish frames that contain belugas from those that do not. We have been using this algorithm since 2021 to pre-sort images that are ultimately uploaded to Beluga Bits.


We have had some awesome breakthroughs in the project thanks to the help of citizen scientists! We have been able to resight two different whales now based on unique marks seen in our photo dataset. One whale (shown in the photo below) has a series of dot-like marks on the side of its dorsal ridge. This whale was resighted in 2021, the third year it has been spotted. Similarly, we had another resighting in 2021 of a whale with a large scar across its melon that was originally seen in 2017. These resightings are so important because we can start to paint a picture of what whales are returning to the estuary, how frequently, and how they are healing!


Western Hudson Bay is also being evaluated as a possible National Marine Conservation Area (NMCA), and information about the species that inhabit the ecosystem is vital to getting this designation. NMCAs are established to regulate activity and use, and to protect the species and habitats within them. We are hoping that information from the Beluga Bits program will help inform managers when making this important decision and monitoring the population into the future.


That is a great question without an easy answer. We know belugas are an Arctic-adapted species, so if the habitat becomes less like the Arctic, they will face increased pressures. We also know that belugas are found farther south than Churchill, in places like the St. Lawrence River in eastern Canada. These belugas are not doing very well and are listed as endangered, but we are not sure how much of that is climate related and how much is due to other human activities. We suspect that as the habitat changes due to climate warming, belugas will face increasing threats as prey distribution and abundance changes, as the ocean becomes more acidic and noisy, and as people expand their activities. Ultimately, beluga populations are only found in places with seasonal sea ice, so reductions in this habitat will not benefit belugas.


The project is 48 kHz / 24-bit.

Is there any advantage to exporting a track's audio at 32-bit depth (or a higher bit depth than the recorded project audio)?

It's not a final mix; it uses plugins and will be used in another 48 kHz / 24-bit project.

The mixed audio, however, will ultimately be rendered to DVD and CD quality.


When exporting a track at 32 bits, you're just copying it unaltered from its internal format. So if you're sending a file to someone to incorporate into a mix, it makes sense to send them 32-bit files. However, no one would actually notice if you sent them 24-bit files instead, as that's still plenty to push the noise level way below audibility. So there technically is an advantage, but it's extremely small.


Note that converting to a higher bit depth (e.g. your project is at 24 bits and you import a 16-bit file) just means adding some zeroes to the data. Doesn't actually change anything. Cakewalk uses 32 bits (or 64 bits if you're concerned about not using all the memory you bought) for one reason: to preserve accuracy when performing multiplication on the data within the DAW (which just about everything does, from setting faders to adding reverb to inserting an EQ).
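That "adding some zeroes" is easy to see in code. A minimal Python sketch (the function name is mine, just for illustration):

```python
def widen_16_to_24(sample_16: int) -> int:
    """Pad a signed 16-bit sample to 24 bits by appending zero LSBs.

    The level relative to full scale is unchanged; eight zero bits
    are simply added below the existing data.
    """
    return sample_16 << 8

# A quarter-of-full-scale 16-bit sample...
s16 = 8192                          # 0x2000
s24 = widen_16_to_24(s16)
print(hex(s24))                     # 0x200000

# ...is still exactly a quarter of 24-bit full scale
print(s24 / 2**23 == s16 / 2**15)   # True
```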


Be aware that 16-bit and 24-bit are integer-based formats, whereas 32-bit / 64-bit are floating-point formats.



The jump to 32-bit is not the same as the jump from 16-bit to 24-bit; since they're different formats, they're not really comparable in the same way.
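One practical consequence of the integer/float difference is headroom. A small Python sketch (an illustration of the formats in general, not of any particular DAW):

```python
import struct

# A value 1.5x above digital full scale, e.g. after a hot fader move
peak = 1.5

# Integer audio has a hard ceiling: anything above the maximum code
# is clamped, permanently flattening the waveform peak.
int24_max = 2**23 - 1
clipped = min(int(peak * 2**23), int24_max)
print(clipped / 2**23)   # ~1.0, the overshoot is gone

# 32-bit float stores a sign, exponent, and mantissa, so values above
# 1.0 survive a round trip and can be turned down later without damage.
as_float = struct.unpack("<f", struct.pack("<f", peak))[0]
print(as_float)          # 1.5, headroom preserved
```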



Generally speaking, exports should always be 16-bit or 24-bit unless you're going to be doing additional processing to the exported file.


Thanks, all, for the use cases. The reason I even asked is that the default for a track export was already set to 32-bit, and my project being 48 kHz / 24-bit caused confusion, as I thought the export would have defaulted to that. I go long periods without recording and forget this info, and I appreciate being able to go back to the forums for info like this.
