Mirodir

joined 2 years ago
[–] Mirodir@discuss.tchncs.de 109 points 11 months ago* (last edited 11 months ago) (3 children)

Sure. You have to solve it from the inside out:

  • not() — see the comment below for this one, I was tricked ~~is a base function that negates what's inside (turning True into False and vice versa); giving it no parameter returns True (because no parameter counts as False)~~
  • str(x) turns x into a string, in this case it turns the boolean True into the text string 'True'
  • min(x) returns the smallest element of an iterable. Python compares characters by their Unicode code points, and capital letters have lower code points than lowercase ones, so here it returns 'T'; if the comparison were case-insensitive it would return 'e' instead.
  • ord(x) returns the unicode number of x, in this case turning 'T' into the integer 84
  • range(x) creates an iterable from 0 to x (non-inclusive), in this case you can think of it as the list [0, 1, 2, …, 82, 83] (it's technically an object of type range but details...)
  • sum(x) sums up all elements of an iterable; summing all integers from 0 to 83 gives 3486
  • chr(x) is the inverse of ord(x) and returns the character with Unicode code point x, which, you guessed it, is 'ඞ' at code point 3486.

The huge coincidental part is that ඞ lies at a position that can be reached by a cumulative sum of integers between 0 and a given integer. From there on it's only a question of finding a way to feed that integer into chr(sum(range(x)))
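The chain above can be checked step by step in any Python REPL; this is plain standard Python with no assumptions beyond the expression itself:

```python
# Evaluate the nested expression from the inside out.
step1 = not ()             # True: an empty tuple is falsy, so `not` flips it
step2 = str(step1)         # 'True'
step3 = min(step2)         # 'T' (capitals have lower Unicode code points)
step4 = ord(step3)         # 84
step5 = sum(range(step4))  # 0 + 1 + ... + 83 = 3486
step6 = chr(step5)         # 'ඞ' (U+0D9E)

# The whole thing in one go:
print(chr(sum(range(ord(min(str(not ())))))))  # prints ඞ
```

Note that `not()` only works because `()` is an empty tuple: `not` is an operator, not a function, so `not ()` simply negates a falsy value.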

[–] Mirodir@discuss.tchncs.de 2 points 11 months ago (1 children)

From experience with the beta, and from memory, your wife (and you) will be able to choose which version to play: either yours with a ton of DLC or hers with none. You should both be able to use the version with all the DLC, just not at the same time.

It's been a while since we tested this though so things might have changed, including my memory...

[–] Mirodir@discuss.tchncs.de 3 points 11 months ago

Those are typically the ones without any prefix.

With the notable exception of the kg...for some inexplicable reason.

[–] Mirodir@discuss.tchncs.de 9 points 11 months ago (2 children)

If you wanna see a language model (almost) exclusively trained on 4chan, here you go.

[–] Mirodir@discuss.tchncs.de 2 points 11 months ago (1 children)

Presumably. Wouldn't take much to fake that though.

[–] Mirodir@discuss.tchncs.de 29 points 11 months ago* (last edited 11 months ago) (2 children)

> after leaving can’t join another for a year

Can you fix this? There was enough misinformation floating around about this already when this feature went into beta.

> Adults can leave a family at any time, however, they will need to wait 1 year from when they joined the previous family to create or join a new family.

It should say something like: "After joining, can't join another for a year"

[–] Mirodir@discuss.tchncs.de 2 points 11 months ago

Despite all that, there was the Warlock who ninja'd the sword drop in Naxx on the final day. Fun times. Now that we're slowly approaching Classic MoP we'll finally see if the master plan worked out too.

[–] Mirodir@discuss.tchncs.de 6 points 11 months ago

> When we're talking about teaching kids the alphabet we need to train both individual and applied letters

This is only slightly related, but I once met a young (US American) adult who thought the stripy horse animal's name was pronounced zed-bra in British English, and it was really hard to convince her otherwise. In her mind zebra was strongly connected to Z-bra, so of course if someone were to pronounce the letter "zed" it would turn into "zed-bra" and not just into "zeh-bra".

[–] Mirodir@discuss.tchncs.de 6 points 11 months ago (2 children)

That data is also publicly available (of course), so a model could be trained on it. I'd love to say I'd doubt Google/YouTube would ever do that, but at this point nothing would surprise me.

[–] Mirodir@discuss.tchncs.de 14 points 11 months ago (1 children)

Usually, if there's a scam, someone's making money off it. This is them. They want to keep making money.

[–] Mirodir@discuss.tchncs.de 8 points 11 months ago

I trained the generative models all from scratch. Pretrained models are not that helpful when it's important to accurately capture very domain specific features.

One of the classifiers I tried was based on zoobot with a custom head. Assuming the publications around zoobot are truthful, it was trained exclusively on similar data from a multitude of different sky surveys.

[–] Mirodir@discuss.tchncs.de 24 points 11 months ago (2 children)

Does it? I worked on training a classifier and a generative model on freely available galaxy images taken by Hubble and labelled in a citizen science approach. Where's the theft?
