So I just wanna ramble a little here:
The cost is about $200/month at the high end right now, at least in my case. I tend to hit token limits pretty quickly on lower tiers.
Any of the “pro” subscriptions from the big three are generally enough for coding. If you want API-level access, expect something like $50–$100 per 8 hours of actual usage.
Out of the three, Anthropic’s models have been noticeably better at coding in my experience, though that opinion may be outdated. It’s just as likely that I’ve learned how to work around Anthropic’s quirks better than those of OpenAI or Google’s models.
So now, choose how you want to use it: vibe coding or treating it like a coding monkey.
In the vibe coding case, you describe the big picture and hope for the best. This works up to a point, but once the codebase grows, things start to fall apart. The issue isn’t raw context length, it’s that the coherence of your original design intent starts to dissolve. The model keeps producing code, but it stops being your code in any meaningful architectural sense.
In the other mode, you treat it like a junior dev. That means extremely detailed instructions, strict requirements, constant review, and re-explaining context over and over because it has no real memory of your project. Also, when it’s wrong, it’s very confidently wrong.
So you still need to be able to architect and plan complex systems yourself, and you still need to do serious due diligence in code review. Otherwise you end up with things like a “dynamic array” that passes tests for 100 elements by literally using 100 if-statements returning fixed-size buffers. It met the test, just not the intent.
Anyone claiming otherwise isn’t seeing the full picture. It’s not really hype so much as a misunderstanding of what programming actually is. Most of the work has always been thinking, not typing.
If you're coming at this without a dev background, vibe coding is probably your entry point. It can genuinely get you surprisingly far on a simple webapp. Just go in knowing the floor will eventually drop out, and the more complex your requirements, the sooner that happens.
For me personally, it saves maybe ~30% of my effort once I factor in planning and verification. The real benefit is that reviewing and steering code is mentally easier than writing and debugging everything from scratch.
It’s a solid tool. Just not a magic one.
The closest thing to “magic” shows up when you pair a strong developer with a stack they don’t know. An LLM can fill in the syntax, patterns, and boilerplate well enough that you effectively bypass most of the ramp-up time.
--
You received this message because you are subscribed to the Google Groups "LVL1 - Louisville's Hackerspace" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lvl1+uns...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/lvl1/CAP16ngq3t9Aw7VqkwsQStGPq19OchNzssDzs-vpVj6fX4zdASw%40mail.gmail.com.
Go through our chat history: is there some place where we discussed zero-knowledge crypto for DMS software?
Create an architecture/design document of the DMS system we discussed,
focusing on the core stack components
and the initial app shell: OAuth2-based account creation and validation, reliable key exchange between the two parties, and QR code/hash based validation/confirmation of the exchanged keys by the end users to avoid man-in-the-middle attacks.
This document outlines the architecture for a zero-knowledge Document Management System (DMS) and electronic document signing service. The foundational concept relies on an implementation where a Requestor and a Signer interact without the underlying service provider ever having access to the plaintext data. In an era of increasingly severe data breaches and strict regulatory requirements (such as GDPR and HIPAA), this architecture ensures that a compromise of the central servers results only in the exposure of useless ciphertext.
Key elements from our initial planning include:
Zero-Password Accounts: Utilizing federated identity (OAuth2) to remove the friction and security risks of traditional passwords, effectively eliminating the threat of credential stuffing and password reuse attacks.
Separation of Cryptographic Duties: Generating distinct, separate key pairs for signatures (identity/integrity) and encryption (confidentiality). This separation is a cryptographic best practice, ensuring that a compromise of an encryption key does not automatically allow an attacker to forge identity signatures.
Blind Routing: The Signer submits their public keys to the Requestor. The Requestor signs and encrypts a data blob using those keys, routing it through the Service Provider. The Service Provider acts strictly as a secure courier and storage mechanism—able to see metadata like file size or routing timestamps, but entirely blind to the sensitive contents of the documents.
Market Positioning: This architecture was designed with an understanding of common search terms businesses use when evaluating secure document signing services. By focusing on "end-to-end encryption," "zero-knowledge," and "cryptographic proof of identity," the technical features align closely with high-security enterprise needs, differentiating the product from basic, server-trust e-signature platforms.
The architecture enforces a strict separation of concerns where the server acts purely as a routing and storage mechanism for encrypted blobs, while all critical cryptographic operations occur exclusively on the client-side.
Frontend (Client Application): A Single Page Application (SPA) built with React, utilizing the Mantine component library for a highly polished, accessible, and responsive user interface. This layer is responsible for local key generation, cryptographic signing, encryption/decryption, and hashing. It relies on the native WebCrypto API for secure, hardware-accelerated operations, or an audited WebAssembly crypto library (e.g., libsodium) to ensure memory-safe execution of complex cryptographic algorithms directly in the browser.
Backend (Service Provider): A high-performance API built with FastAPI (Python) that handles routing, permission management (verifying who is authorized to send a blob to whom), and database interactions. It seamlessly integrates a tus server protocol implementation to support reliable, resumable uploads for potentially massive encrypted document blobs. Because it never processes plaintext, the backend requires significantly less computational overhead and carries a drastically reduced liability profile.
Persistent Storage & File Hosting: PostgreSQL serves as the primary relational database, providing robust, ACID-compliant persistent storage for user identities, public keys, relational metadata, and audit trails. This is paired with an Amazon S3 (or S3-compatible) object storage system for housing the actual encrypted large file blobs. Even in the event of a total database and S3 exfiltration by a malicious actor, the data remains protected, as the cryptographic keys required for decryption exist only on the end-users' devices.
Caching & Real-Time Communication: Redis is utilized as a high-performance, in-memory cache store and message broker. It accelerates frequent database queries (like public key lookups), manages fast-access session states, and powers real-time communication channels (e.g., via WebSockets). This allows the system to instantly notify active clients when a new encrypted document has been routed to them or when a signature is completed, without relying on continuous HTTP polling.
Background Processing: Celery handles asynchronous background workloads. This includes dispatching secure routing notifications to users (via email or SMS), maintaining complex audit logs, and safely purging expired encrypted blobs from S3, all without blocking the main FastAPI application or impacting the user's response times.
Identity Provider (IdP): An external OAuth2 provider (Google, Microsoft, Apple) to handle the zero-password authentication layer, offloading the heavy lifting of account security, multi-factor authentication (MFA), and identity verification to trusted industry giants.
To maintain a frictionless "zero-password" experience while ensuring secure account creation and validation, the application relies entirely on federated identity.
OAuth2 Flow: The user accesses the app shell and initiates an OAuth2 Authorization Code flow with PKCE (Proof Key for Code Exchange). PKCE is specifically utilized to protect Single Page Applications from authorization code interception attacks, ensuring that only the client that initiated the request can exchange the code for a token.
Token Issuance: The IdP returns a verifiable identity token (JWT) containing the user's standardized profile information.
Validation & Account Creation: The FastAPI backend verifies the JWT signature against the IdP's published public keys. If the signature is valid and the user is new, a user record is created in PostgreSQL containing their verified email and unique ID.
Session Establishment: The backend issues a short-lived session token to the client (often cached in Redis for extremely fast validation). This token only grants access to the routing service (allowing the user to upload or download blobs). It has absolutely no connection to, or authority over, the user's local cryptographic keys.
Upon successful authentication, the React client application generates the necessary cryptographic material locally. It is a fundamental rule of this architecture that private keys never leave the client device in plaintext.
Signature Key Pair: Used for proving identity and ensuring document integrity. We utilize Ed25519, a public-key signature system carefully engineered at several levels of design and implementation to achieve very high speeds without compromising security against side-channel attacks.
Encryption Key Pair: Used for securing the document payload. We utilize X25519, an elliptic-curve Diffie-Hellman key agreement; the shared secret it produces is used to derive the symmetric keys that actually encrypt the data, efficiently and securely.
Local Storage: Private keys are stored securely within the browser. To mitigate Cross-Site Scripting (XSS) risks, keys are stored in IndexedDB using the WebCrypto API with the extractable flag set to false. This prevents malicious scripts from easily exfiltrating the raw private key material. For mobile wrappers, hardware-backed secure enclaves (like Apple's Secure Enclave or Android's Keystore) are utilized.
Public Key Registration: The client sends both the Signature Public Key and the Encryption Public Key to the backend, associating them with their OAuth-verified identity in PostgreSQL for other users to query.
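In the browser this generation happens via WebCrypto or libsodium; the following Python sketch (using the `cryptography` package) only shows the shape of the two distinct key pairs and of a hypothetical registration payload. Field names like `sig_pub`/`enc_pub` are assumptions, not an API:

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def generate_client_keys():
    """Generate the two distinct client-side key pairs:
    Ed25519 for signatures, X25519 for encryption (key agreement)."""
    sig_priv = Ed25519PrivateKey.generate()
    enc_priv = X25519PrivateKey.generate()
    raw, fmt = serialization.Encoding.Raw, serialization.PublicFormat.Raw
    registration = {
        # Only the public halves ever leave the device.
        "sig_pub": sig_priv.public_key().public_bytes(raw, fmt).hex(),
        "enc_pub": enc_priv.public_key().public_bytes(raw, fmt).hex(),
    }
    return sig_priv, enc_priv, registration
```

The private objects stay local; the `registration` dict is what would be POSTed to the backend and stored against the OAuth-verified identity.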
The most vulnerable phase in any end-to-end encrypted system is the initial exchange of public keys. A Man-in-the-Middle (MitM) attacker could intercept the backend request and seamlessly substitute their own public keys, allowing them to decrypt the document, read it, re-encrypt it with the true recipient's key, and pass it along undetected. We utilize out-of-band hash/QR validation to neutralize this threat.
Initiation: The Requestor queries the backend for the Signer's public keys.
Delivery: The backend returns the Signer's Public Signature Key and Public Encryption Key (often served quickly from the Redis cache).
Fingerprinting: The Requestor's client locally computes a cryptographic hash (e.g., SHA-256) of the combined public keys. This hash is visually represented as both a short, human-readable alphanumeric string and a scannable QR code to accommodate different verification scenarios.
Out-of-Band Validation:
In-Person: The Requestor uses their device's camera to scan the QR code displayed on the Signer's device (which contains the Signer's self-generated hash of their own public keys).
Remote: The Requestor contacts the Signer via an independent, secure channel (e.g., a phone call, SMS, or Signal message) and reads off the alphanumeric hash. The Signer confirms it perfectly matches what is displayed on their screen.
Confirmation: Once validated, the Requestor's client explicitly flags the Signer's public keys as "Trusted" in their local state, ensuring all future communications with this user are secure against interception.
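The fingerprinting step can be sketched as follows. The domain-separation prefix, truncation length, and four-character grouping are illustrative choices for readability over the phone, not a specification; the full digest would go into the QR code:

```python
import hashlib

def key_fingerprint(sig_pub: bytes, enc_pub: bytes) -> str:
    """Hash the concatenated public keys and format a short, human-readable
    fingerprint. Both parties compute this independently and compare."""
    digest = hashlib.sha256(b"dms-key-fingerprint-v1" + sig_pub + enc_pub).hexdigest()
    # First 16 hex characters, grouped in fours for easy read-back out of band.
    short = digest[:16].upper()
    return "-".join(short[i:i + 4] for i in range(0, 16, 4))
```

Because the hash is computed locally on each side from the keys each side believes are in play, a MitM substitution produces mismatching fingerprints, which is exactly what the out-of-band comparison catches.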
Once the public keys are exchanged and validated, the core document signing and routing process begins. This flow utilizes a hybrid encryption model for optimal performance with large files.
Preparation: The Requestor prepares the document and locally generates a one-time symmetric key for use with an authenticated cipher, AES-256-GCM. AES-GCM is chosen because it provides both data confidentiality and authenticity (verifying the ciphertext hasn't been tampered with).
Encryption: The potentially large document is encrypted quickly using this symmetric key within the client's browser.
Key Wrapping: Because symmetric keys cannot be shared safely in the open, the symmetric key is then "wrapped" (encrypted) to the Signer's validated Public Encryption Key (in practice, via an ephemeral X25519 exchange and a derived wrapping key, in the style of libsodium's sealed boxes). This is much faster than attempting to encrypt a large document directly with asymmetric cryptography.
Signing: The Requestor signs the completely encrypted payload (or a hash of the ciphertext and metadata) using their own Private Signature Key. This creates a non-repudiable proof that the Requestor authored this specific encrypted blob.
Routing & Upload: The Requestor packages the encrypted document, the wrapped symmetric key, and the digital signature into a unified data blob. This blob is uploaded to the backend using the tus resumable upload protocol, ensuring that large files can be paused and resumed without failure, even on unstable connections. The FastAPI backend validates the upload permissions, records the metadata in PostgreSQL, and directly stores the physical blob into S3 storage. Concurrently, a Celery background task may be triggered to notify the Signer via email or SMS, and a real-time Redis pub/sub event is fired to instantly update the Signer's dashboard if they are online.
Retrieval & Decryption:
The Signer queries the backend and pulls the encrypted blob from S3.
The Signer first verifies the Requestor's signature using the Requestor's Public Signature Key (which was previously validated via the QR/Hash method). If the signature fails, the document is rejected as tampered or forged.
The Signer uses their Private Encryption Key to unwrap the symmetric key.
Finally, the Signer uses the unwrapped symmetric key to decrypt the document, completing the zero-knowledge transaction safely on their local device.
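The whole hybrid flow above, the Requestor's encrypt/wrap/sign side and the Signer's verify/unwrap/decrypt side, can be sketched end-to-end in Python with the `cryptography` package. The HKDF info string and the sealed-box-style ephemeral exchange are assumptions standing in for whatever the client crypto library actually provides:

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def _kek(shared: bytes) -> bytes:
    """Derive a key-encryption key from an X25519 shared secret (assumed scheme)."""
    return HKDF(hashes.SHA256(), 32, salt=None, info=b"dms-key-wrap-v1").derive(shared)

# --- Setup: long-term keys, normally generated on each party's device ---
requestor_sig = Ed25519PrivateKey.generate()   # Requestor's signature key pair
signer_enc = X25519PrivateKey.generate()       # Signer's encryption key pair

# --- Requestor side: encrypt the document, wrap the key, sign the ciphertext ---
document = b"Please sign this contract."
file_key = AESGCM.generate_key(bit_length=256)          # one-time symmetric key
nonce = os.urandom(12)
ciphertext = AESGCM(file_key).encrypt(nonce, document, None)
eph = X25519PrivateKey.generate()                       # ephemeral, used once
wrap_nonce = os.urandom(12)
wrapped = AESGCM(_kek(eph.exchange(signer_enc.public_key()))).encrypt(
    wrap_nonce, file_key, None)
signature = requestor_sig.sign(ciphertext)              # non-repudiable proof

# --- Signer side: verify the signature FIRST, then unwrap and decrypt ---
try:
    requestor_sig.public_key().verify(signature, ciphertext)
except InvalidSignature:
    raise SystemExit("tampered or forged blob; reject before decrypting")
recovered_key = AESGCM(_kek(signer_enc.exchange(eph.public_key()))).decrypt(
    wrap_nonce, wrapped, None)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
assert plaintext == document
```

Note the ordering: the signature check happens before any decryption, so a forged or altered blob is rejected without ever touching the recipient's encryption key.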