maegul

joined 2 years ago
[–] maegul@lemmy.ml 6 points 10 months ago (6 children)

I remember hearing rumours during the rollout that tech employees were found asking for help on forums in ways that didn't bode well for the health and talent of the team building it.

But yea, it’s the embarrassment of this sort of stuff that must be masking the real financials of PT and how viable a free system would be.

[–] maegul@lemmy.ml 3 points 10 months ago (9 children)

Yea I’ve kept track of how often I’ve encountered inspectors, and most of the time it’d be worth it to not get the ticket or not tap on. Sometimes though I’ve noticed an increase in the number of inspectors that would definitely shift the equation. Also train stations with gates complicate the matter.

I don’t know if it’s out there, but I’d personally like to know how the finances come out for making PT free. You obviously lose revenue, but also all the overhead of paying for inspectors and for all of the ticketing infrastructure. I also wonder if the part that makes the finances work is all the fines collected, which would be pretty fucking shithouse if true.

[–] maegul@lemmy.ml 13 points 10 months ago

The catch is that the whole system is effectively centralised on BlueSky backend services (basically the relay). So while the protocol may be standardised and open, and interpreted with decentralised components, they’ll control the core service. Which means they can unilaterally decide to introduce profitable things like ads and charging for features.

The promise of the system though is that it provides for various levels of independence that can all connect to each other, so people with different needs and capabilities can all find their spot in the ecosystem. Whether that happens is a big question. Generally I’d say I’m optimistic about the ideas and architecture, but unsure about whether the community around it will get it to what I think it should be.

[–] maegul@lemmy.ml 1 points 10 months ago

I don’t recall seeing ads in Netflix

Didn’t they start an ads tier ages ago?

And yea I wasn’t commenting on the shows, just that I’d been away from Netflix for so long I haven’t even had a chance to see any of these.

[–] maegul@lemmy.ml 15 points 10 months ago (2 children)

It's funny how long I've been off of netflix ... I haven't seen any of these (even despite wanting to watch Stranger Things S4 just to finish it).

Something about netflix just burned me. I tried subscribing again after they introduced ads, and the amount of clunkiness and incompatibility that came with it basically burned me for good. I subscribe to some things here and there, but Netflix has fallen almost completely off my radar.

[–] maegul@lemmy.ml 2 points 10 months ago

Suspicion is totally fair re BlueSky IMO. The system they've designed seems to me (and to others, AFAICT) to have the potential to include interconnected components or sections with varying degrees of independence.

The elephant in the room, which I point out on BlueSky whenever I can, is that no one seems to really be trying to build the hard parts of that out. Which is a shame because it could be interesting.

EG, there’s a chance that a hybridised system running both BlueSky’s protocol and the fediverse’s could be viable and quite useful. Add to that the integration with some E2EE, and it finally feels like an actual attempt at building something new for the modern internet.

Fortunately there is some noise around these ideas, so hopefully their system can outlast their finances. But yea, a rug pull is definitely not out of the question.

[–] maegul@lemmy.ml 1 points 10 months ago

Oh I’m with you there! And otherwise totally understandable.

[–] maegul@lemmy.ml 1 points 10 months ago

I'm not close to Gaiman or the fandom or the story, at all ... but I did manage to pick up on some of the whisper network through mastodon, where I read a couple of posts or blog posts by people who seemed to be close to the fandom and, in one case, even friends with him. The vibe was very much that he's likely a womanising guy, has been his whole life, and likely has standards "below" what most would consider normal. There also seemed to be a bit of introspection about how the fandom around him was kinda culty and how that clearly enabled whatever he did: a common enough pattern seemed to be that someone would go through something unseemly, but because it was Gaiman and all their friends were fans, they presumed it was them or their fault and never wanted to open up about it.

Like I said though, this is essentially gossip, so don't take it seriously or anything (also I don't have any links or anything).

[–] maegul@lemmy.ml 2 points 10 months ago (2 children)

I think for python tooling the choice is Python Vs Rust. C isn’t in the mix either.

That seems fair. Though I recall Mamba making headway (at least in the anaconda/conda space), and it is a C++ project. AFAIU, its underlying internals have now been folded into conda, which would mean a fairly popular and arguably successful portion of the tooling ecosystem (I tended to reach for conda and recommend the same to many) is reliant on a C++ foundation.

On the whole, I imagine this is a good thing as the biggest issue Conda had was performance when trying to resolve packaging environments and versions.

So, including C++ as part of C (which is probably fair for the purposes of this discussion), I don't think C is out of the mix either. Should there ever be a push to fold something into core python, using C would probably come back into the picture too.


I think there’s a survivor bias going on here.

Your survivorship bias point on rust makes a lot of sense ... there's certainly some push back against its evangelists and that's fair (as someone who's learnt the language a bit). Though I think it's fair to point out the success stories are "survivorship" stories worth noting.

But it seems we probably come back to whether fundamental tooling should be done in python or a more performant stack. And I think we just disagree here. I want the tooling to "just work" and work well, and personally I don't hold nearly as much interest in being able to contribute to it as I do with any other python project. If that can be done in python, all the better, but I'm personally not convinced (my experience with conda, back when it was a pure python project, is informative for me here).

Personally I think python should have paid more attention to both built-in tooling (again, I think it's important to point out how much of this is simply Guido's "I don't want to do that" that probably wouldn't be tolerated these days) and built-in options for more performance (by maybe taking pypy and JIT-ing more seriously).

Maybe the GIL-less work and more performant python tricks coming down the line will make your argument more compelling to people like me.

(Thanks very much for the chat BTW, I personally appreciate your perspective as much as I'm arguing with you)

[–] maegul@lemmy.ml 1 points 10 months ago

Funny to only ask/report on Zendaya if she’d come back after Messiah.

[–] maegul@lemmy.ml 5 points 10 months ago

Yep! And likely the lesson to take from it for Python in general. The general utility of a singular foundation that the rest of the ecosystem can be built out from.

Even that it’s compiled is kinda beside the point. There could have been a single Python tool written in Python and bundled with its own Python runtime. But Guido never wanted to do projects and package management and so it’s been left as the one battery definitely not included.

[–] maegul@lemmy.ml 6 points 10 months ago (4 children)

I feel like this is conflating two questions now.

  1. Whether to use a non-Python language where appropriate
  2. Whether to use rust over C, which is already heavily used and fundamental in the ecosystem (I think we can put cython and Fortran to the side)

I think these questions are mostly independent.

If the chief criterion is accessibility to the Python user base, issue 2 isn't a problem IMO. One could argue, as does @eraclito@feddit.it in this thread, that in fact rust provides benefits along these lines that C doesn't. Rust being influenced by Python adds weight to that. Either way though, people like and want to program in rust, and it has had marked success so far in the Python ecosystem (as eraclito cites). It's still a new-ish language, but if the core issue is C v Rust, it's probably best to address it on those terms.

 

Intro

Having read through the macros section of "The Book" (Chapter 19.6), I thought I would try to hack together a simple idea using macros as a way to get a proper feel for them.

The chapter was a little light, and declarative macros (using macro_rules!), which is what I'll be using below, seemed like a potentially very nice feature of the language ... the sort of thing that really makes the language malleable. Indeed, in poking around I've realised, perhaps naively, that macros are a pretty common tool for rust devs (or at least more common than I knew).

I'll rant for a bit first, which those new to rust macros may find interesting or informative (it's kinda a little tutorial) ... to see the implementation, skip to the "Implementation (without using a macro)" heading below.

Using a macro

Well, "declarative macros" (with macro_rules!) were pretty useful I found and easy to get going with (such that it makes perfect sense that they're used more frequently than I thought).

  • It's basically pattern matching on arbitrary code and then emitting new code through a templating-like mechanism (pretty intuitive; a minimal sketch follows just below).
  • The type system and rust-analyzer LSP understand what you're emitting perfectly well in my experience. It really felt properly native to rust.
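To make that concrete before getting into the details, here's a minimal sketch of my own (not from The Book) showing the whole shape of a declarative macro, pattern and emitted code together:

macro_rules! wrap_some {
    // match a single expression ...
    ($e:expr) => {
        // ... and emit new code that wraps it
        Some($e)
    };
}

fn main() {
    let x = wrap_some!(1 + 2); // expands to Some(1 + 2)
    assert_eq!(x, Some(3));
}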

The Elements of writing patterns with "Declarative macros"

Use macro_rules! to declare a new macro

Yep, it's also a macro!

Create a structure just like a match expression

  • Except the pattern will match on the code provided to the new macro
  • ... And uses special syntax for matching on generic parts or fragments of the code
  • ... And it returns new code (not an expression or value).

Write a pattern as just rust code with "generic code fragment" elements

  • You write the code you're going to match on, but for the parts that you want to capture as they will vary from call to call, you specify variables (or more technically, "metavariables").
    • You can think of these as the "arguments" of the macro. As they're the parts that are operated on while the rest is literally just static text/code.
  • These variables will have a name and a type.
  • The name is prefixed with a dollar sign $ like so: $GENERIC_CODE.
  • And its type follows a colon as in ordinary rust: $GENERIC_CODE:expr
    • These types are actually syntax specifiers. They specify what part of rust syntax will appear in the fragment.
    • Presumably, they link right back into the rust parser and are part of how these macros integrate pretty seamlessly with the type system and borrow checker or compiler.
    • Here's a decent list from rust-by-example (you can get a full list in the rust reference on macro "metavariables"):
      • block
      • expr is used for expressions
      • ident is used for variable/function names
      • item
      • literal is used for literal constants
      • pat (pattern)
      • path
      • stmt (statement)
      • tt (token tree)
      • ty (type)
      • vis (visibility qualifier)

So a basic pattern that matches on any struct while capturing the struct's name, its only field's name, and its type would be:

macro_rules! my_new_macro {
    (
        struct $name:ident {
            $field:ident: $field_type:ty
        }
    ) => {
        // the emitted code goes here (covered in "Writing the emitted or new code" below)
    };
}

Now, $name, $field and $field_type will be captured for any single-field struct (and, presumably, the validity of the syntax enforced by the "fragment specifiers").

Capture any repeated patterns with + or *

  • Yea, just like regex
  • Wrap the repeated pattern in $( ... )
  • Place whatever separating code that will occur between the repeats after the wrapping parentheses:
    • EG, a separating comma: $( ... ),
  • Place the repetition counter/operator after the separator: $( ... ),+

Example

So, to capture multiple fields in a struct (expanding from the example above):

macro_rules! my_new_macro {
    (
        struct $name:ident {
            $field:ident: $field_type:ty,
            $( $ff:ident : $ff_type:ty ),*
        }
    ) => {
        // the emitted code goes here (covered in the next section)
    };
}
  • This will capture the first field and then any additional fields.
    • The way you use these repeats mirrors the way they're captured: they all get used in the same way and rust will simply repeat the new code for each repeated capture.

Writing the emitted or new code

Use => as with match expressions

  • Actually, it's => { ... }, IE with braces (any of (), [] or {} is accepted around the emitted code, though braces are the convention)

Write the new emitted code

  • All the new code is simply written between the braces
  • Captured "variables" or "metavariables" can be used just as they were captured: $GENERIC_CODE.
  • Except the fragment-specifier types aren't needed here
  • Captured repeats are expressed within wrapped parentheses just as they were captured: $( ... ),*, including the separator (which can be different from the one used in the capture).
    • The code inside the parentheses can differ from that captured (that's the point after all), but at least one of the variables from the captured fragment has to appear in the emitted fragment so that rust knows which set of repeats to use.
    • A useful feature here is that the repeats can be used multiple times, in different ways in different parts of the emitted code (the example at the end will demonstrate this).

Example

For example, we could convert the struct to an enum where each field became a variant with an enclosed value of the same type as the struct:

macro_rules! my_new_macro {
    (
        struct $name:ident {
            $field:ident: $field_type:ty,
            $( $ff:ident : $ff_type: ty),*
        }
    ) => {
        enum $name {
            $field($field_type),
            $( $ff($ff_type) ),*
        }
    }
}

With the above macro defined ... this code ...

my_new_macro! {
    struct Test {
        a: i32,
        b: String,
        c: Vec<String>
    }
}

... will emit this code ...

enum Test {
    a(i32),
    b(String),
    c(Vec<String>)
}
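Just to close the loop, the generated enum behaves like any hand-written one. A hypothetical usage of the emitted Test enum (not part of the original example):

let t = Test::a(5);
match t {
    Test::a(n) => println!("an i32: {}", n),
    Test::b(s) => println!("a String: {}", s),
    Test::c(v) => println!("a Vec with {} Strings", v.len()),
}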

Application: "The code" before making it more efficient with a macro

Basically ... a simple system for custom types to represent physical units.

The Concept (and a rant)

A basic pattern I've sometimes implemented on my own (without bothering with dependencies that is) is creating some basic representation of physical units in the type system. Things like meters or centimetres and degrees or radians etc.

If your code relies on such units and performs conversions at any point, it is way too easy to fuck up, and therefore worth, IMO, creating some safety around. NASA provides an obvious warning (the Mars Climate Orbiter was lost to a metric/imperial mix-up). As does, IMO, common sense and experience: most scientists and physical engineers learn the importance of "dimensional analysis" of their calculations.

In fact, it's the sort of thing that should arguably be built into any language that takes types seriously (like eg rust). I feel like there could be an argument that it'd be as reasonable as the numeric abstractions we've worked into programming??

At the bottom I'll link whatever crates I found for doing a better job of this in rust (one of which seemed particularly interesting).

Implementation (without using a macro)

The essential design is (again, this is basic):

  • A single type for a particular dimension (eg time or length)
  • Method(s) for converting between units of that dimension
  • Ideally, flags or constants of some sort for the units (thinking of enum variants here)
    • These could be methods too
#[derive(Debug)]
pub enum TimeUnits {s, ms, us, }

#[derive(Debug)]
pub struct Time {
    pub value: f64,
    pub unit: TimeUnits,
}

impl Time {
    pub fn new<T: Into<f64>>(value: T, unit: TimeUnits) -> Self {
        Self {value: value.into(), unit}
    }

    fn unit_conv_val(unit: &TimeUnits) -> f64 {
        match unit {
            TimeUnits::s => 1.0,
            TimeUnits::ms => 0.001,
            TimeUnits::us => 0.000001,
        }
    }

    fn conversion_factor(&self, unit_b: &TimeUnits) -> f64 {
        Self::unit_conv_val(&self.unit) / Self::unit_conv_val(unit_b)
    }

    pub fn convert(&self, unit: TimeUnits) -> Self {
        Self {
            value: (self.value * self.conversion_factor(&unit)),
            unit
        }
    }
}

So, we've got:

  • An enum TimeUnits representing the various units of time we'll be using
  • A struct Time that will be any given value of "time" expressed in any given unit
  • With methods for converting from any units to any other unit, the heart of which being a match expression on the new unit that hardcodes the conversions (relative to base unit of seconds ... see the conversion_factor() method which generalises the conversion values).

Note: I'm using T: Into<f64> for the new() method and f64 for Time.value as that is the easiest way I know to accept either integers or floats as values. It works because i32 (and most other numerics) can be converted losslessly to f64.
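For instance, a quick illustration assuming the definitions above:

let a = Time::new(5, TimeUnits::ms);   // integer literal: i32 -> f64 via Into
let b = Time::new(2.5, TimeUnits::s);  // float literal works directly
println!("{:?}", b.convert(TimeUnits::us));
// Time { value: 2500000.0, unit: us }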

Obviously you can go further than this. But the essential point is that each dimension (time, length, etc.) needs its own type, with all the desired functionality implemented manually or through some handy use of blanket trait implementations.

Defining a macro instead

For something pretty basic, the above is an annoying amount of boilerplate!! May as well rely on a dependency!?

Well, we can write the boilerplate once in a macro and then only provide the informative parts!

In the case of the above, the only parts that matter are:

  • The name of the type/struct
  • The name of the units enum type we'll use (as they'll flag units throughout the codebase)
  • The names of the units we'll use and their value relative to the base unit.

IE, for the above, we only need to write something like:

struct Time {
    value: f64,
    unit: TimeUnits,
    s: 1.0,
    ms: 0.001,
    us: 0.000001
}

Note: this isn't valid rust! But that doesn't matter, so long as we can write a pattern that matches it and emit valid rust from the macro, it's all good! (Which means we can write our own little DSLs with native macros!!)

To capture this, all we need are what we've already done above: capture the first two fields and their types, then capture the remaining "field names" and their values in a repeating pattern.

Implementation of the macro

The pattern

macro_rules! unit_gen {
    (
        struct $name:ident {
            $v:ident: f64,
            $u:ident: $u_enum:ident,
            $( $un:ident : $value:expr ),+
        }
    ) => {
        // the emitted code goes here (see the full macro below)
    };
}
  • Note the repeating fragment doesn't provide a type for the field, but instead captures an expression (expr) after it, despite this being invalid rust.

The Full Macro

macro_rules! unit_gen {
    (
        struct $name:ident {
            $v:ident: f64,
            $u:ident: $u_enum:ident,
            $( $un:ident : $value:expr ),+
        }
    ) => {
        #[derive(Debug)]
        pub struct $name {
            pub $v: f64,
            pub $u: $u_enum,
        }
        impl $name {
            fn unit_conv_val(unit: &$u_enum) -> f64 {
                match unit {
                $(
                    $u_enum::$un => $value
                ),+
                }
            }
            fn conversion_factor(&self, unit_b: &$u_enum) -> f64 {
                Self::unit_conv_val(&self.$u) / Self::unit_conv_val(unit_b)
            }
            pub fn convert(&self, unit: $u_enum) -> Self {
                Self {
                    // use the captured field names so this works whatever the fields are called
                    $v: (self.$v * self.conversion_factor(&unit)),
                    $u: unit
                }
            }
        }
        #[derive(Debug)]
        pub enum $u_enum {
            $( $un ),+
        }
    }
}

Note the repeating capture is used twice here in different ways.

  • The capture is: $( $un:ident : $value:expr ),+

And in the emitted code:

  • It is used in the unit_conv_val method as: $( $u_enum::$un => $value ),+
    • Here the ident $un is being used as the variant of the enum that is defined later in the emitted code
    • Where $u_enum is also used without issue, as the name/type of the enum, despite not being part of the repeated capture but another variable captured outside of the repeated fragments.
  • It is then used in the definition of the variants of the enum: $( $un ),+
    • Here, only one of the captured variables is used, which is perfectly fine.

Usage

Now all of the boilerplate above is unnecessary, and we can just write:

unit_gen!{
    struct Time {
        value: f64,
        unit: TimeUnits,
        s: 1.0,
        ms: 0.001,
        us: 0.000001
    }
}

Usage from main.rs:

use units::Time;
use units::TimeUnits::{s, us};

fn main() {
    let x = Time{value: 1.0, unit: s};
    let y = x.convert(us);

    println!("{:?}", x);
    println!("{:?}", y);
}

Output:

Time { value: 1.0, unit: s }
Time { value: 1000000.0, unit: us }
  • Note how the struct and enum created by the emitted code are properly available from the module as though they were written manually or directly.
  • In fact, my LSP (rust-analyzer) was able to autocomplete these immediately once the macro was written and called.

Crates for unit systems

I did a brief search for actual unit systems and found the following:

dimensioned

dimensioned documentation

  • Easily the most interesting to me (from my quick glance), as it seems to have created the most native and complete representation of physical units in the type system
  • It creates, through types, a 7-dimensional space, one for each SI base unit
  • This allows all possible units to be represented as a reduction to a point in this space.
    • EG, if the dimensions are [seconds, meters, kgs, amperes, kelvins, moles, candelas], then the Newton, m.kg / s^2 would be [-2, 1, 1, 0, 0, 0, 0].
  • This allows all units to be mapped directly to this consistent representation (interesting!!), and all operations to then be done easily and systematically (a toy sketch of the idea follows below).
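To make the exponent-space idea concrete, here's a toy sketch of my own (dimensioned does this at the type level, at compile time; this runtime version just illustrates the arithmetic):

// exponents over the base units [s, m, kg, A, K, mol, cd]
#[derive(Debug, PartialEq)]
struct Dim([i8; 7]);

impl Dim {
    // multiplying two quantities adds their unit exponents
    fn mul(&self, other: &Dim) -> Dim {
        let mut out = [0i8; 7];
        for i in 0..7 {
            out[i] = self.0[i] + other.0[i];
        }
        Dim(out)
    }
}

fn main() {
    let kg = Dim([0, 0, 1, 0, 0, 0, 0]);
    let accel = Dim([-2, 1, 0, 0, 0, 0, 0]); // m / s^2
    assert_eq!(kg.mul(&accel), Dim([-2, 1, 1, 0, 0, 0, 0])); // the Newton
}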

Unfortunately, I'm not sure if the repository is still maintained.

uom

uom documentation

  • This might actually be good too, I just haven't looked into it much
  • It also seems to be currently maintained

F#

Interestingly, F# actually has a system built in!

 

I looked around and struggled to find out what it does?

My guess would be that it notifies you of when new posts are made to communities you subscribe to. But that sounds like a lot, so I'm really not sure.

Otherwise, is it me or does the wording here not speak for itself?

 

Report showing the shift in AI sentiment in the industry. Relatively in depth and probably coming from a pro-AI bias (I haven’t read the whole thing).

The last graph at the bottom was what I was linked to. It clearly shows a corner being turned: those closer to the actual "product" are now sceptical, while management (the last category in the chart) remains more committed.

 

Generally, the lens through which I've come to assess any/all fediverse projects is how well they foster community building. One reason why I like and "advocate" for the lemmy/threadiverse side of things is precisely because of this and how the centrality of the community/sub/group is a good way of organising social media (IMO).

Also, because of that, I recently came to be skeptical of the effects that the "All" feed can have. I didn't even realise that people relied mostly on the All feed until recently.

I think I've reached the point now of being against it (at least tentatively). I know, it's a staple and there's no way it's going away. And I know it's useful.

But thinking about the feature set, through the community building lens, I think it'd be fair to say that things are out of balance: they don't promote community building enough while also providing the All feed which dissolves community building.

Not really a criticism of the developers ... AFAIU, the All feed is easier to implement than any other community building feature ... and it's expected from reddit (though it isn't normal on forums AFAICT, which is maybe worth considering for anyone happy to reassess what about reddit is retained and what isn't).

But still, I can imagine a platform that is more focused on communities:

  • Community explorer tool built in.
    • Could even be a substitute for an All feed ... where you can browse through various communities you don't know about and see what they've posted recently
  • Multi-communities (long time coming by now for many I'd say)
    • Could even be part of the community explorer tool where you can create on-the-fly multi-communities to see their posts in a temporary feed
  • Private and local only communities (already here on lemmy and coming for private communities)
  • Post visibility options for Public communities (IE, posts that opt-in private)
  • More flexible notifications for various things/events that happen within a community
  • Wikis
  • Chat interface
    • I'm thinking this is pretty viable given that Lemmy used to use a web-socket auto-updating design ... add that to the flat chat view and you've got a chat room. There are resource issues, so limiting them to one per community or 6hrs per week per community or something would probably be necessary.

A possibly interesting and frustrating aspect of all of these suggestions/ideas above is I can see their federation being problematic or difficult ... which raises the issue of whether there's serious tension between platform design and protocol capabilities.

 

There are also some gems in there about how old and constant the practice of underplaying the amount of VFX in a film is.

From the video, Stand By Me had a VFX shot (the train bridge scene, of course) but no one was allowed to talk about that. And of course The Fugitive train crash scene had to have "real trains" even though it's all mostly miniatures.

 

The post mentions data or research showing how rust usage is resulting in fewer errors in comparison to C. Anyone aware of good sources for that?

 

Let's try this experiment

Start watching Big Trouble in Little China at 7pm, Central Time, USA (as precisely as you can) ... and come here for live posts as you watch!

This is ~24 hours from the time of this post

Here's a timeanddate.com link to the timezone(s) involved.


@AVincentInSpace@pawb.social has volunteered to run a live watch on cytube the day afterward (approx 7pm Monday). Posts and links should be coming (and see comments below on the idea).

 

Let's try this experiment

Start watching Big Trouble in Little China at 7pm, Central European Summer Time (as precisely as you can) ... and come here for live posts as you watch!

This is ~17 hours from the time of this post

Here's a timeanddate.com link to the timezone(s) involved.

 

Let's try this experiment

Start watching Big Trouble in Little China at 7pm, Australian Eastern Standard Time (as precisely as you can) ... and come here for live posts as you watch!

This is ~9 hours from the time of this post

Here's a timeanddate.com link to the timezone(s) involved.


A little short notice, it's more of an experiment this time. I've got my DVD and will most likely pop in.

 

A supplement to typical tutorials that caters to C programmers interested in learning how to be unsafe upfront.

Seems good from a quick skim. Also seems that the final lesson is that starting on the safe/happy path in rust doesn’t have to cost performance if you know what you’re doing.

 

Just a quick riff/hack on whether it'd be hard to make a collect() method that "collected" into a Vec without needing any turbofish (see, if you're interested, my prior post on the turbofish).

Some grasp of traits and iteration is required to comfortably get this ... though it might be a fun dive even if you're not there yet.

Background on collect

The implementation of collect is:

fn collect<B: FromIterator<Self::Item>>(self) -> B
where
    Self: Sized,
{
    FromIterator::from_iter(self)
}

The generic type B is bound by FromIterator, which basically enables a type to be constructed from an Iterator. In other words, collect() returns any type that can be built from an iterator. EG, Vec.

The reason the turbofish comes about is that, as I said above, it returns "any type" that can be built from an iterator. So when we run something like:

let z = [1i32, 2, 3].into_iter().collect();

... we have a problem ... rust, or the collect() method has no idea what type we're building/constructing.

More specifically, looking at the code for collect, in the call of FromIterator::from_iter(self), which is calling the method on the trait directly, rust has no way to determine which implementation of the trait to use. The one on Vec or HashMap or String etc??

Thus, the turbofish syntax specifies the generic type B, which (through type inference) then determines which implementation to use.

let z = [1i32, 2, 3].into_iter().collect::<Vec<_>>();

IE: Use the implementation on Vec!
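A small illustration of the inference at work (my own examples): anything that pins down B picks the implementation, whether that's the turbofish or a type annotation on the binding.

let v = [1i32, 2, 3].into_iter().collect::<Vec<_>>();  // Vec's FromIterator impl
let s = ["a", "b"].into_iter().collect::<String>();    // String's FromIterator impl
let v2: Vec<i32> = [1i32, 2, 3].into_iter().collect(); // same as v, via the binding's type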

Why not just use Vec?

I figure Vec is used so often as the type for collecting an Iterator that it could be nice to have a convenient method.

The docs even hint at this by suggesting that calling the FromIterator::from_iter() method directly from the desired type (eg Vec) can be more readable (see FromIterator docs).

EG ... using collect:

let d = [1i32, 2, 3];
let x = d.iter().map(|x| x + 100).collect::<Vec<_>>();

Using Vec::from_iter()

let y = Vec::from_iter(d.iter().map(|x| x + 100));

As Vec is always in the prelude (IE, it's always available), using from_iter clearly seems like a nicer option here.

But you lose method chaining! So ... how about a method on Iterator, like collect but for Vec specifically? How would you make that and is it hard??

Making collect_vec()

It's not hard actually

  • Define a trait, CollectVec that defines a method collect_vec which returns Vec<Self::Item>
  • Make this a "sub-trait" of Iterator (or, make Iterator the "supertrait") so that the Iterator::collect() method is always available
  • Implement CollectVec for all types that implement Iterator by just calling self.collect() ... the type inference will take care of the rest, because it's clear that a Vec will be used.
trait CollectVec: Iterator {
    fn collect_vec(self) -> Vec<Self::Item>;
}

impl<I: Iterator> CollectVec for I {
    fn collect_vec(self) -> Vec<Self::Item> {
        self.collect()
    }
}

With this you can then do the following:

let d = [1i32, 2, 3];
let d2 = d.iter().map(|x| x + 1).collect_vec();

Don't know about you, but implementing such methods for the common collection types would suit me just fine ... that turbofish is a pain to write ... and AFAICT this isn't inconsistent with rust's style/design. And it's super easy to implement ... the type system handles this issue very well.
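As a sketch of how the same trick could extend to the other common collection types (the names here are my own and purely hypothetical), a HashMap version just needs the key/value types on the trait:

use std::collections::HashMap;
use std::hash::Hash;

// hypothetical sibling of CollectVec for iterators of key/value pairs
trait CollectHashMap<K, V>: Iterator<Item = (K, V)> {
    fn collect_hashmap(self) -> HashMap<K, V>
    where
        Self: Sized,
        K: Eq + Hash;
}

impl<K, V, I: Iterator<Item = (K, V)>> CollectHashMap<K, V> for I {
    fn collect_hashmap(self) -> HashMap<K, V>
    where
        Self: Sized,
        K: Eq + Hash,
    {
        self.collect()
    }
}

// let m = [("a", 1), ("b", 2)].into_iter().collect_hashmap();

(For what it's worth, the itertools crate already ships a collect_vec() with exactly this shape.)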

 

cross-posted from: https://lemmy.ml/post/18372054

Today, we sat down and reviewed NeoDB, a Fediverse review system that lets you track books, movies, music, tv shows, games, podcasts, and more. There's some really incredible ideas beneath the surface.
