
#Async

Latest posts tagged with #Async on Bluesky


🦀 Rust Tip #001

let net = Arc::new(Mutex::new(policy_net));
tokio::spawn(async move {
    // Mutex<T> is Sync whenever T is Send, so Arc<Mutex<T>> lets a
    // !Sync policy_net be moved into and used from another task.
    let guard = net.lock().unwrap();
    guard.forward(&input)
});

A `!Sync` type cannot be shared between threads via `&T`.

#async #concurrence #pytorch

Some developers still think async/await makes code faster. I thought the same when I started with .NET. Then one day, our API started slowing down under load. Not because of the CPU. Not because of the... | Stefan Đokić | 27 comments

www.linkedin.com/posts/djokic... - #dotNET #async & #await doesn't make it faster, it makes it so other things can happen while it waits.
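What `await` actually buys you can be sketched in a few lines (a generic TypeScript illustration of the same idea, not code from the linked .NET post): a single awaited call is no faster, but independent waits can overlap, and the caller stays free in the meantime.

```typescript
// A single await is no faster than a blocking call; the win is that
// waits can overlap and the caller is free while they are pending.
const delay = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

async function sequential(): Promise<number> {
  const start = Date.now();
  await delay(50); // the second wait only starts after this one finishes
  await delay(50);
  return Date.now() - start; // roughly 100 ms
}

async function overlapped(): Promise<number> {
  const start = Date.now();
  await Promise.all([delay(50), delay(50)]); // both waits run concurrently
  return Date.now() - start; // roughly 50 ms
}
```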

Etoile's stream flyer today: "Blar Brigade Async Monthly" The doodle of the day is of Etoile dragging along a bag in one hand of various items from games, her other hand holding a treasure map, facing ahead looking very confused.

🟢BLAR!!!
Wowweee~!
Time for another monthly async!
Likely gonna be a cozy stream ^ ^

twitch.tv/etoiledoll
#dragongirl #vtuber #transrights #randomizer #async

Correcting Common Async/Await Mistakes in .NET 10 - Brandon Minnick - NDC London 2026
This talk was recorded at NDC London in London, England.

buff.ly/4KWZDgS

#dotnet #ndc #async #csharp #dotnet10

CS5610 Midterm Evaluation - Yi-Peng Chiang (YouTube video by Catherine Chiang)

Stop UI freezing! ❄️🚫

My Midterm Demo shows how Async/Await keeps apps responsive.

I live-coded a 5s background fetch to prove you can still highlight text & scroll while data loads. No more zombie pages! 🧟‍♂️

See the Live Coding here: www.youtube.com/watch?v=G_vC... 💻✨

#JS #WebDev #Async #Coding
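The idea behind the demo can be sketched in miniature (a simulated fetch using a timer, not the code from the video): while an awaited operation is pending, the event loop keeps running other scheduled work, which is why the page stays responsive.

```typescript
// A slow background "fetch" (simulated with a timer) does not block
// other scheduled work: the event loop keeps ticking while we await.
const ticks: number[] = [];
const ticker = setInterval(() => ticks.push(Date.now()), 10);

const slowFetch = new Promise<string>((resolve) =>
  setTimeout(() => resolve("page data"), 150),
);

const data = await slowFetch; // UI events would still be handled here
clearInterval(ticker);
// ticks kept accumulating during the 150 ms wait
```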

Original post on hispagatos.space

Translation of the previous toot by @rek2

A reminder to all the #usenet #hackers, linux #manipulador types and terminal #kunfu #ninjas on our Usenet #inn2 server: we have live local newsgroups on #hackercultura and #seguridad that you can read without having to go over to […]


I'm glad to report that stepping through #async #vala is every bit as annoying as stepping through async #rust .

#debugging #rustlang #gdb


TIME FOR ANOTHER BIG DRAGONPELAGO!
22 DRAGONS
51 GAMES!
STARTING NOW AS WE BEGIN DRAGONPELAGO #5! JOIN ME AND MORE DRAGONS FOR KICKOFF AS WE GET THIS #Randomizer GOING!
#Archipelago #Retro #Dragons #ASync


What is asynchronous messaging? In this post, Kyle McMaster explains the concepts within asynchronous messaging and mentions some of the frameworks including RabbitMQ, NServiceBus, and MassTransit. Check out this post here:

https://bit.ly/4bTRfAS

#SoftwareDevelopment #async

Laurent Kempé - Exploring .NET 11 Preview 1 Runtime Async: A dive into the Future of Async in .NET .NET 11 Preview 1 ships a groundbreaking feature: Runtime Async. Instead of relying solely on the C# compiler to rewrite async/await methods into state machi...

Exploring .NET 11 Preview 1 Runtime Async: A dive into the Future of Async in .NET | by Laurent Kempé

buff.ly/6p27k3g

#dotnet #async #csharp #programming


🚀 .NET 11 Preview 1 brings Runtime Async enabled by default + Native AOT. Fewer allocations and clearer async stacks — try it! ⚡️ Runtime-managed async replaces heavy compiler state machines — meaning smaller heaps and cleaner traces.

#dotnet #csharp #async


If you've been working with Java for a while, you've surely …

magicmarcy.de/completablefuture-asynch...

#CompletableFuture #asynchrone_Programmierung #async #asynchron #Threads #Hauptthread #Verkettung #kombinieren #exceptionally #Parallelität #reaktiv


Scale your apps beyond single-core with the Worker class.

Run CPU-intensive tasks as separate helpers (Console apps) and report progress back to the UI.

UI stays fluid. Multi-core performance unlocked.

#XojoWorker #Performance #Async

Niteo IRL#17 – Taiwan The year on the calendar turned to 2026. Time for New Year’s resolutions and improving IRL 17 in Taipei, Taiwan. I got to Taiwan a few days before the IRL, and my first impressions weren’t great. Going from warm, sunny Thailand to cloudy, windy, rainy Taipei wasn’t a pleasant change. Luckily, the weather cleared up […]

New blog post: IRL #17 in Taipei
New project, two museums, a hike, street food, and a cycling tour on the last day.
Read the blog: niteo.co/blog/irl17/

#remotework #async #retreat

Linting intra-task concurrency and FutureLock | farnoy.dev Clippy lint that identifies wrong uses of intra-task concurrency and prevents cases of FutureLock.

New post, on my brand new blog!

farnoy.dev/posts/future... #rust #async

Ariel OS

My #FOSDEM discovery: #ArielOS .
#Rust, both #async and preemptive scheduling, microcontroller-friendly. Based on #embassyrs .
Sounds promising, gonna have to check it out once I'm back home.

https://ariel-os.org/

Ryuichi Sakamoto – async: Where Silence Becomes Memory - Quiet Thunder Reviews A quiet, reflective review of Ryuichi Sakamoto’s async—an album where fading sound, memory, and fragile textures shape a final masterpiece of silence and presence.

I’ve written a new English review on Ryuichi Sakamoto’s async—
an album where noise fades, memory begins, and silence becomes a final instrument.
A quiet space for anyone who listens deeply.
#RyuichiSakamoto #async #QuietThunderReviews
Review:
quietthunder.hatenablog.com/entry/music/...

The TypeScript compiler's Go rewrite is 10x faster: the Microsoft team reveals the engineering details behind it - Tony Bai. Permanent link - https://tonybai.com/2026/01/27/typescript-compiler-go-rewrite-10x-speed-microsoft-details Hello everyone, I'm Tony Bai. "JavaScript is a great language, but it was not designed for writing compilers."

The TypeScript compiler's Go rewrite is 10x faster: the Microsoft team reveals the engineering details behind it. Permanent link – tonybai.com/2026/01/27/typescript-co...

#技术志 #AbstractSyntaxTree #ast #async #Compiler #Concurrency #Consstring #CrossPlatform #electron #Figma #FunctionColoring


Avoiding common pitfalls with async/await - Stephen Cleary - NDC Copenhagen 2025
This talk was recorded at NDC Copenhagen in Copenhagen, Denmark.

www.youtube.com/watch?v=a_O5...

#dotnet #async #csharp #programming #devcommunity #ndc

Building High-Performance APIs with FastAPI and Async Python Learn how to build high-performance APIs with FastAPI and Async Python. Discover async endpoint design, middleware, background tasks, and benchmarking techniques for scalable web services.

Building High-Performance APIs with FastAPI and Async Python:
dasroot.net/posts/2026/0...
#FastAPI #Async #Python #Redis

GitHub - thomaswitt/aws-sdk-http-async: Async HTTP handler plugin for the AWS SDK for Ruby, built on async-http

I just released the aws-sdk-http-async gem: github.com/thomaswitt/a... - an Async HTTP handler plugin for the AWS SDK for Ruby, built on async-http

More information on the background: thomas-witt.com/blog/aws-sdk...

#ruby #rails #aws #async #fibers


with minimal modifications, provided that developers account for concurrency considerations such as shared mutable state."
I wish Python had followed the same path; the "async/await" keywords are a nightmare. 🥲
#python #async #programming

Original post on hollo.social

#Optique 0.9.0 is here!

This release brings #async/await support to #CLI parsers. Now you can validate input against external resources—databases, APIs, Git repositories—directly at parse time, with full #TypeScript type safety.

The new _@optique/git_ package showcases this: validate branch […]

Designing type-safe sync/async mode support in TypeScript

I recently added sync/async mode support to Optique, a type-safe CLI parser for TypeScript. It turned out to be one of the trickier features I've implemented—the `object()` combinator alone needed to compute a combined mode from all its child parsers, and TypeScript's inference kept hitting edge cases.

## What is Optique?

Optique is a type-safe, combinatorial CLI parser for TypeScript, inspired by Haskell's optparse-applicative. Instead of decorators or builder patterns, you compose small parsers into larger ones using combinators, and TypeScript infers the result types. Here's a quick taste:

```typescript
import { object } from "@optique/core/constructs";
import { argument, option } from "@optique/core/primitives";
import { string, integer } from "@optique/core/valueparser";
import { run } from "@optique/run";

const cli = object({
  name: argument(string()),
  count: option("-n", "--count", integer()),
});

// TypeScript infers: { name: string; count: number | undefined }
const result = run(cli); // sync by default
```

The type inference works through arbitrarily deep compositions—in most cases, you don't need explicit type annotations.

## How it started

Lucas Garron (@lgarron) opened an issue requesting async support for shell completions. He wanted to provide `Tab`-completion suggestions by running shell commands like `git for-each-ref` to list branches and tags.

```typescript
// Lucas's example: fetching Git branches and tags in parallel
const [branches, tags] = await Promise.all([
  $`git for-each-ref --format='%(refname:short)' refs/heads/`.text(),
  $`git for-each-ref --format='%(refname:short)' refs/tags/`.text(),
]);
```

At first, I didn't like the idea. Optique's entire API was synchronous, which made it simpler to reason about and avoided the "async infection" problem where one async function forces everything upstream to become async. I argued that shell completion should be near-instantaneous, and if you need async data, you should cache it at startup.

But Lucas pushed back. The filesystem _is_ a database, and many useful completions inherently require async work—Git refs change constantly, and pre-caching everything at startup doesn't scale for large repos. Fair point.

## What I needed to solve

So, how do you support both sync and async execution modes in a composable parser library while maintaining type safety? The key requirements were:

* `parse()` returns `T` or `Promise<T>`
* `complete()` returns `T` or `Promise<T>`
* `suggest()` returns `Iterable<T>` or `AsyncIterable<T>`
* When combining parsers, if _any_ parser is async, the combined result must be async
* Existing sync code should continue to work unchanged

The fourth requirement is the tricky one. Consider this:

```typescript
const syncParser = flag("--verbose");
const asyncParser = option("--branch", asyncValueParser);

// What's the type of this?
const combined = object({ verbose: syncParser, branch: asyncParser });
```

The combined parser should be async because one of its fields is async. This means we need type-level logic to compute the combined mode.

## Five design options

I explored five different approaches, each with its own trade-offs.

### Option A: conditional types with mode parameter

Add a mode type parameter to `Parser` and use conditional types:

```typescript
type Mode = "sync" | "async";
type ModeValue<M extends Mode, T> = M extends "async" ? Promise<T> : T;

interface Parser<M extends Mode, TValue, TState> {
  parse(context: ParserContext<TState>): ModeValue<M, ParserResult<TState>>;
  // ...
}
```

The challenge is computing combined modes:

```typescript
type CombineModes<T extends Record<string, Parser<any, any, any>>> =
  T[keyof T] extends Parser<infer M, any, any>
    ? M extends "async" ? "async" : "sync"
    : never;
```

### Option B: mode parameter with default value

A variant of Option A, but place the mode parameter first with a default of `"sync"`:

```typescript
interface Parser<M extends Mode = "sync", TValue, TState> {
  readonly $mode: M;
  // ...
}
```

The default value maintains backward compatibility—existing user code keeps working without changes.

### Option C: separate interfaces

Define completely separate `Parser` and `AsyncParser` interfaces with explicit conversion:

```typescript
interface Parser<TValue, TState> { /* sync methods */ }
interface AsyncParser<TValue, TState> { /* async methods */ }

function toAsync<T, S>(parser: Parser<T, S>): AsyncParser<T, S>;
```

Simpler to understand, but requires code duplication and explicit conversions.

### Option D: union return types for suggest() only

The minimal approach. Only allow `suggest()` to be async:

```typescript
interface Parser<TValue, TState> {
  parse(context: ParserContext<TState>): ParserResult<TState>; // always sync
  suggest(
    context: ParserContext<TState>,
    prefix: string,
  ): Iterable<Suggestion> | AsyncIterable<Suggestion>; // can be either
}
```

This addresses the original use case but doesn't help if async `parse()` is ever needed.

### Option E: fp-ts style HKT simulation

Use the technique from fp-ts to simulate Higher-Kinded Types:

```typescript
interface URItoKind<A> {
  Identity: A;
  Promise: Promise<A>;
}
type Kind<F extends keyof URItoKind<any>, A> = URItoKind<A>[F];

interface Parser<F extends keyof URItoKind<any>, TValue, TState> {
  parse(context: ParserContext<TState>): Kind<F, ParserResult<TState>>;
}
```

The most flexible approach, but with a steep learning curve.

## Testing the idea

Rather than commit to an approach based on theoretical analysis, I created a prototype to test how well TypeScript handles the type inference in practice. I published my findings in the GitHub issue:

> Both approaches correctly handle the "any async → all async" rule at the type level. (…) Complex conditional types like `ModeValue<CombineParserModes<T>, ParserResult<TState>>` sometimes require explicit type casting in the implementation. This only affects library internals. The user-facing API remains clean.

The prototype validated that Option B (explicit mode parameter with default) would work. I chose it for these reasons:

* _Backward compatible_: The default `"sync"` keeps existing code working
* _Explicit_: The mode is visible in both types and runtime (via a `$mode` property)
* _Debuggable_: Easy to inspect the current mode at runtime
* _Better IDE support_: Type information is more predictable

## How `CombineModes` works

The `CombineModes` type computes whether a combined parser should be sync or async:

```typescript
type CombineModes<T extends readonly Mode[]> =
  "async" extends T[number] ? "async" : "sync";
```

This type checks if `"async"` is present anywhere in the tuple of modes. If so, the result is `"async"`; otherwise, it's `"sync"`.

For combinators like `object()`, I needed to extract modes from parser objects and combine them:

```typescript
// Extract the mode from a single parser
type ParserMode<T> = T extends Parser<infer M, unknown, unknown> ? M : never;

// Combine modes from all values in a record of parsers
type CombineObjectModes<T extends Record<string, Parser<Mode, unknown, unknown>>> =
  CombineModes<{ [K in keyof T]: ParserMode<T[K]> }[keyof T][]>;
```

## Runtime implementation

The type system handles compile-time safety, but the implementation also needs runtime logic. Each parser has a `$mode` property that indicates its execution mode:

```typescript
const syncParser = option("-n", "--name", string());
console.log(syncParser.$mode); // "sync"

const asyncParser = option("-b", "--branch", asyncValueParser);
console.log(asyncParser.$mode); // "async"
```

Combinators compute their mode at construction time:

```typescript
function object<T extends Record<string, Parser<Mode, unknown, unknown>>>(
  parsers: T,
): Parser<CombineObjectModes<T>, ObjectValue<T>, ObjectState<T>> {
  const parserKeys = Reflect.ownKeys(parsers);
  const combinedMode: Mode = parserKeys.some(
    (k) => parsers[k as keyof T].$mode === "async",
  )
    ? "async"
    : "sync";
  // ... implementation
}
```

## Refining the API

Lucas suggested an important refinement during our discussion. Instead of having `run()` automatically choose between sync and async based on the parser mode, he proposed separate functions:

> Perhaps `run(…)` could be automatic, and `runSync(…)` and `runAsync(…)` could enforce that the inferred type matches what is expected.

So we ended up with:

* `run()`: automatic based on parser mode
* `runSync()`: enforces sync mode at compile time
* `runAsync()`: enforces async mode at compile time

```typescript
// Automatic: returns T for sync parsers, Promise<T> for async
const result1 = run(syncParser); // string
const result2 = run(asyncParser); // Promise<string>

// Explicit: compile-time enforcement
const result3 = runSync(syncParser); // string
const result4 = runAsync(asyncParser); // Promise<string>

// Compile error: can't use runSync with async parser
const result5 = runSync(asyncParser); // Type error!
```

I applied the same pattern to `parse()`/`parseSync()`/`parseAsync()` and `suggest()`/`suggestSync()`/`suggestAsync()` in the facade functions.

## Creating async value parsers

With the new API, creating an async value parser for Git branches looks like this:

```typescript
import type { Suggestion } from "@optique/core/parser";
import type { ValueParser, ValueParserResult } from "@optique/core/valueparser";

function gitRef(): ValueParser<"async", string> {
  return {
    $mode: "async",
    metavar: "REF",
    parse(input: string): Promise<ValueParserResult<string>> {
      return Promise.resolve({ success: true, value: input });
    },
    format(value: string): string {
      return value;
    },
    async *suggest(prefix: string): AsyncIterable<Suggestion> {
      const { $ } = await import("bun");
      const [branches, tags] = await Promise.all([
        $`git for-each-ref --format='%(refname:short)' refs/heads/`.text(),
        $`git for-each-ref --format='%(refname:short)' refs/tags/`.text(),
      ]);
      for (const ref of [...branches.split("\n"), ...tags.split("\n")]) {
        const trimmed = ref.trim();
        if (trimmed && trimmed.startsWith(prefix)) {
          yield { kind: "literal", text: trimmed };
        }
      }
    },
  };
}
```

Notice that `parse()` returns `Promise.resolve()` even though it's synchronous. This is because the `ValueParser<"async", T>` type requires all methods to use async signatures.

Lucas pointed out this is a minor ergonomic issue. If only `suggest()` needs to be async, you still have to wrap `parse()` in a Promise. I considered per-method mode granularity (e.g., `ValueParser<ParseMode, SuggestMode, T>`), but the implementation complexity would multiply substantially. For now, the workaround is simple enough:

```typescript
// Option 1: Use Promise.resolve()
parse(input) {
  return Promise.resolve({ success: true, value: input });
}

// Option 2: Mark as async and suppress the linter
// biome-ignore lint/suspicious/useAwait: sync implementation in async ValueParser
async parse(input) {
  return { success: true, value: input };
}
```

## What it cost

Supporting dual modes added significant complexity to Optique's internals. Every combinator needed updates:

* Type signatures grew more complex with mode parameters
* Mode propagation logic had to be added to every combinator
* Dual implementations were needed for sync and async code paths
* Type casts were sometimes necessary in the implementation to satisfy TypeScript

For example, the `object()` combinator went from around 100 lines to around 250 lines. The internal implementation uses conditional logic based on the combined mode:

```typescript
if (combinedMode === "async") {
  return {
    $mode: "async" as M,
    // ... async implementation with Promise chains
    async parse(context) {
      // ... await each field's parse result
    },
  };
} else {
  return {
    $mode: "sync" as M,
    // ... sync implementation
    parse(context) {
      // ... directly call each field's parse
    },
  };
}
```

This duplication is the cost of supporting both modes without runtime overhead for sync-only use cases.

## Lessons learned

### Listen to users, but validate with prototypes

My initial instinct was to resist async support. Lucas's persistence and concrete examples changed my mind, but I validated the approach with a prototype before committing. The prototype revealed practical issues (like TypeScript inference limits) that pure design analysis would have missed.

### Backward compatibility is worth the complexity

Making `"sync"` the default mode meant existing code continued to work unchanged. This was a deliberate choice. Breaking changes should require user action, not break silently.

### Unified mode vs per-method granularity

I chose unified mode (all methods share the same sync/async mode) over per-method granularity. This means users occasionally write `Promise.resolve()` for methods that don't actually need async, but the alternative was multiplicative complexity in the type system.

### Designing in public

The entire design process happened in a public GitHub issue. Lucas, Giuseppe, and others contributed ideas that shaped the final API. The `runSync()`/`runAsync()` distinction came directly from Lucas's feedback.

## Conclusion

This was one of the more challenging features I've implemented in Optique. TypeScript's type system is powerful enough to encode the "any async means all async" rule at compile time, but getting there required careful design work and prototyping.

What made it work: conditional types like `ModeValue<M, T>` can bridge the gap between sync and async worlds. You pay for it with implementation complexity, but the user-facing API stays clean and type-safe.

Optique 0.9.0 with async support is currently in pre-release testing. If you'd like to try it, check out PR #70 or install the pre-release:

```
npm add @optique/core@0.9.0-dev.212 @optique/run@0.9.0-dev.212
deno add --jsr @optique/core@0.9.0-dev.212 @optique/run@0.9.0-dev.212
```

Feedback is welcome!
Nifty Introduces Automatic Check-Ins | Ahoi.dev Nifty introduced automatic check-ins to its project management platform. It's been designed to help keep your teams aligned and your projects running smoothly.

Nifty introduces Automatic Check-Ins 👉 aho.is/8bfb0e89 #projectmanagement #async

Ryuichi Sakamoto, async: a winter soundscape review where silence stirs memory - Utagoe wo Amu Hibi. A review of the soundscape of Ryuichi Sakamoto's async, in which silence stirs memory; an essay that reads the space between recitation and sound through an emotional lens, tracing a serene world that keeps winter company.

Posted on Hatena Blog #はてなブログ #async #坂本龍一
An album that keeps winter's stillness company.
I've written a review tracing the space "between sound and silence" in Ryuichi Sakamoto's async.

utagoe-hibi.hatenablog.com/entry/music/...


❓ Not sure when to use Task.WhenAll() versus Parallel.ForEach()? Fati talks about how these differ and what they are each suited for.

🧠 Learn more here: https://bit.ly/3WdAxVO

#dotnet #SoftwareDevelopment #ParallelProcessing #async

Go Routines vs Rust Async: A Deep Technical Comparison That Actually Explains the Trade-offs

#GoLang Routines vs #Rust #Async

#Concurrency

medium.com/@yalovoy/go-...


Through a very ambitious triple program, Potemkine brushes aside the clichés that surrounded the famous composer.

The Blu-ray review: regard-critique.fr/home-cinema/...

#Bluray #RyuichiSakamoto #TokyoMelody #Async #Opus #Documentaire #Musique #PotemkineFilms


Explore why #async code in C# might be slower than expected, with real examples illustrating common pitfalls. Optimize your code with insights from this analysis and boost performance. #dotnet
