• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: June 29th, 2023

  • Ironically enough, Aurora city water consistently wins awards for its quality lol.

    I think the legitimate reason is that Aurora is a physically massive city, has lower housing costs than the rest of the metro area, and Denver has a habit of forcing its homeless population out and into Aurora. The police department is also an absolute good-ole-boys club, all terrified of city residents to the point where they drive unmarked/undercover vehicles by default (at least it seems that way: I see very few marked police cars, but whenever there's a collection of cop cars with lights going, the majority are undercover).

    Sauce: Current Aurora, CO resident. It’s not all bad


  • Embedded systems run into this a lot, especially on low-level communication buses. It's pretty common to have a comm bus architecture where one device is supposed to control both the communication happening on the bus and what the other devices are actually doing. SPI and I2C are both examples of this, but both of those buses also have architectures where there isn't one single controller, or where the devices have some other way to arbitrate who is talking on the bus. It's functionally useful to have a term that differentiates the two.

    I've seen Master/Servant used before, which in my experience just trips people up and doesn't really address the cultural reason for not using the original terms.

    Personally I'm a fan of the MIL-STD-1553 terminology, Bus Controller and Remote Terminal, but the letters M and S are so heavily baked into existing literature and designs at this point (e.g. MISO and MOSI) that swapping them out entirely would be costly and few people will do it, so the old terms stick around.
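
    As a tiny illustration (the pin numbers and macro names below are made up, not from any real vendor header), the rename itself is cheap; the cost is every datasheet, schematic, and legacy driver that still says MISO/MOSI:

        /* hypothetical SPI pin aliases using Controller/Peripheral style naming */
        #define SPI_PIN_CONTROLLER_OUT  11U  /* classically "MOSI" */
        #define SPI_PIN_CONTROLLER_IN   12U  /* classically "MISO" */
        #define SPI_PIN_CLOCK           13U  /* SCK */
        #define SPI_PIN_PERIPHERAL_SEL  10U  /* classically "SS" / chip select */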


  • MajorasMaskForever@lemmy.world to Programming@programming.dev... · 5 months ago

    Ada

    It has a lot of really nice features for creating data types, and amazing static analysis at compile time.

    But all the tooling around it is absolute crap, which makes actually using the language unbearable. If it had better tooling I could see it having taken a decent chunk of development away from C and C++.


  • As someone who is in the aerospace industry and has dealt with safety critical code with NASA oversight, it’s a little disingenuous to pin NASA’s coding standards entirely on attempting to make things memory safe. It’s part of it, yeah, but it’s a very small part. There are a ton of other things that NASA is trying to protect for.

    Plus, Rust doesn't solve the underlying problem that NASA is looking to prevent by banning the C++ standard library. Part of it is DO-178 compliance (or lack thereof); the other part is that dynamic memory has the potential to cause all sorts of problems on resource-constrained embedded systems. Statically analyzing dynamic memory usage is virtually impossible and testing for it gets cost-prohibitive real quick, so it's just easier to blanket-ban the STL.

    Also, writing memory-safe code honestly isn't that hard. It just requires a different approach to problem solving that, like any other design pattern, becomes easy once you learn it and get used to it.
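
    For flavor, here's a minimal sketch of the kind of pattern that sidesteps dynamic memory entirely (purely illustrative, not pulled from any NASA or DO-178 document): reserve worst-case storage at build time and hand out slots from it, so running out is a bounded, testable failure instead of heap fragmentation at the worst possible moment.

        #include <stddef.h>

        #define MAX_MSGS 32U

        typedef struct { unsigned char payload[64]; size_t len; } msg_t;

        static msg_t msg_pool[MAX_MSGS];  /* worst-case storage, reserved statically */
        static size_t msgs_in_use;

        /* simple bump allocator: hands out NULL when the pool is exhausted, never frees */
        static msg_t *msg_alloc(void)
        {
            return (msgs_in_use < MAX_MSGS) ? &msg_pool[msgs_in_use++] : NULL;
        }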



  • So many people forget that while they understand how to use a Linux terminal and how Linux works at a high level, not everyone does. Plus, learning all of that takes time, effort, and tenacity, which not everyone is willing to invest. Linus's whole conclusion was that as long as that learning curve exists and as long as it's that easy to shoot yourself in the foot, Linux desktop just isn't viable for a lot of people.

    But Linus has had a lot of public fuck-ups, therefore everything he says must be inherently wrong.



  • The issue with an ongoing service is that the longer it's in use, the more it costs Kia. The larger the time box Kia prices for, the bigger the sticker number gets and the more customers it scares off.

    Using Kia's online build-and-price tool, it looks like the most expensive Telluride you can get right now is $60k MSRP, and the cheapest is $30k.

    Let's assume Kia estimates the average lifetime of a Telluride to be 20 years, so they create an option to purchase this service one time for the "lifetime" of the vehicle. Taking Kia's listed pricing in good faith, using that $150 annual package, and assuming the price goes up 10% every year (what Netflix, YouTube, etc. have been doing), across those twenty years you're looking at around an $8.5k option. At the top trim that's still 14% extra, which will make some buyers hesitant; at the base model it's 28% more expensive.
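
    Back-of-the-envelope behind that $8.5k figure (the 10% annual escalation is my assumption, not anything Kia has published):

        #include <stdio.h>

        int main(void)
        {
            double price = 150.0, total = 0.0;  /* $150/yr package, +10% every year */
            for (int year = 0; year < 20; year++) {
                total += price;
                price *= 1.10;
            }
            printf("20-year total: $%.0f\n", total);  /* roughly $8,600 */
            return 0;
        }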

    Enough buyers will scoff at that so Kia can either ditch the idea entirely as they’ll lose money on having to pay for the initial development and never make their money back, or they find some way to repackage that cost and make it look like something that buyers are willing to deal with.

    To me the bigger issue is the cost of the service vs what you're getting. Server time + dev team + mobile data link cannot be costing Kia more than a few million annually, and mid-to-upper hundreds of thousands is more likely, so they must not be expecting that many people to actually pay for any of this.


  • It's IEEE misinterpreting the guy's original paper.

    https://liuyang12.github.io/proj/privacy_dual_imaging/ (can’t find the full paper, but here’s the abstract at least)

    The paper author straight up says the light sensor is impractical to use as an attack vector, but when you use it in conjunction with other sensors you might be able to glean more information than most might think. It leaves me with the question of what other sensors you can combine to start getting behavioral information that is a security threat.

    I’ll say it worked for me. I read the IEEE headline, called bullshit, dug into it and yeah you can only get a tiny bit of information that you have to stretch pretty far to get useful conclusions from… But it’s more than the zero I initially thought. So props to the paper author, he met his goal. IEEE wanted sensationalized clicks, which they too unfortunately got.


  • Yup, you're right, I was wrong. Valve keeps the copyright regardless.

    Dolphin situation was different though. https://dolphin-emu.org/blog/2023/07/20/what-happened-to-dolphin-on-steam/

    Valve only ever insisted that Nintendo had to give Dolphin permission to distribute, since Valve was afraid of a potential DMCA takedown from Nintendo if Nintendo decided the encryption keys were IP being illegally redistributed. Since Nintendo says emulators are illegal everywhere but a courtroom, the Dolphin team knew they'd never get an OK. Valve probably knew that but didn't care enough to help fight that legal battle.

    I’m not sure Valve cares about brownie points with Nintendo. The Steam Deck is a direct competitor against the Switch, Valve has done nothing to curtail the use of Switch emulators on Deck, and the work Valve has been doing makes using a switch emulator a better experience.

    This whole thing only makes sense if Valve wanted to protect their IP. Involving Nintendo really does sound like blame shifting without having to actually go to court


  • I’m with you on the first part. It makes no sense for Valve to do this. Using LibUltra or not, Nintendo has been relatively lax on people creating new code for the N64. At least to my recollection only in cases where Nintendo felt their IP was directly being threatened did they try and take down fan projects. Even then they heavily rely on the redistribution of Nintendo IP to take things down. Admittedly I have only seen others talking about the Portal 64 project using LibUltra but even so that’s Nintendo’s fight, not Valve’s.

    I don’t see how Valve could possibly be afraid of getting sued here by Nintendo, it doesn’t make sense. Valve did not create it, nor distribute, advertise, or aid in any way. IANAL but I don’t see how Valve could possibly be listed as a party to the lawsuit unless Nintendo lawyers agreed with Valve lawyers to go after this guy for IP theft.

    TBH I see this more as Valve seeing that, with a project this publicly known, if they don't defend their IP here they'll lose any future copyright claims, and wanting to prevent that. They also see an opportunity here: blame Nintendo, who won't flinch at it since they get labelled the legal bad guys all the time, so there's no real dent to their reputation while Valve's internet-golden-child perception stays intact. Valve would never do something like this, so it MUST be Nintendo's fault. Based on the comments in this thread and what I've seen elsewhere, that seems like a good assumption. Nintendo takes the heat while Valve protects their IP.


  • In pure C things are a bit different from what you describe.

    Declaration has (annoyingly) multiple definitions depending on the context. The most basic one is when you are creating an instance of a variable: you are telling the compiler that you want a variable with symbol name X, data type Y, and qualifiers A, B, and C. During compilation the compiler reads that and starts reserving memory for the linker to assign later. These statements are always in the form "qualifiers data_type symbol;"
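
    A bare-bones example of that form (names are just placeholders):

        static volatile unsigned int sensor_count;  /* qualifiers, data type, symbol */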

    Function declaration is a bit different. Here you're telling the compiler "hey, you're going to see this function show up later. Here are the types for the arguments and the return value. I pinky swear promise you'll get a definition somewhere else". You can compile without the definition, but the linker will get real unhappy if the definition isn't there when it runs. Here you're looking at a statement of the form "qualifiers return_data_type symbol(arg_1_data_type arg_1_symbol,…);" Technically in function declarations you don't need argument symbols, just the types, but it's better to include them for readability.
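
    For example (a made-up function, with the argument names kept purely for readability):

        int scale_reading(int raw_value, int gain);  /* declaration: definition promised elsewhere */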

    Structs are different still. Here you're telling the compiler that the struct definition will show up somewhere else in the same translation unit, but the data type symbol may appear before the definition does. So when the compiler sees that data type show up in a variable instance declaration it won't reserve space right away, but it has to have the struct definition before compilation ends. This is pretty straightforward syntax-wise: "struct struct_name;" (Typedefs throw a syntax wrench into this that I won't get into; it's functionally the same though)
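
    The classic use is pointers to the not-yet-defined type (illustrative names):

        struct packet;                              /* forward declaration: tag exists, size unknown */
        struct packet *next_packet;                 /* pointers are fine before the full definition */
        struct packet { unsigned char data[64]; };  /* full definition later in the same unit */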

    One more thing you can do with variables during declaration is to “extern” them. This is more similar to function declaration, where you’re telling the compiler “hey you’re gonna see this symbol pop up, here’s how you use it, but it actually lives somewhere else k thx bye”. I personally don’t like calling this declaration since it behaves differently than normal declaration. This is the same as a normal variable declaration syntax with “extern” tossed in the front of the qualifiers.
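
    Something like (made-up symbol):

        extern unsigned int system_tick_count;  /* defined in some other translation unit */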

    Definitions come in two flavors: function definitions contain the actual code that gets translated into instructions, while enum, struct, and typedef definitions describe memory requirements for when they get used.

    Structs and enums have syntax like "struct struct_name {blah,blah,blah};", typedefs are "typedef old_type new_name;", and a function definition is "qualifiers return_data_type symbol(arg_1_data_type arg_1_symbol,…) {blah,blah,blah}" (note that function definitions don't need a ; at the end, and here you do need argument symbols)
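
    Putting those side by side (toy names):

        struct point { int x; int y; };          /* struct definition: layout only, no storage yet */
        typedef struct point point_t;            /* typedef definition: existing type, then new name */
        int add(int a, int b) { return a + b; }  /* function definition: actual code, no trailing ; */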

    Lastly, when you create a variable instance and say in one statement that you want that symbol to have value X, by the standard that's initialization. So "int foo = 5;" is declaration and initialization. Structs and arrays have special initialization syntax, "struct foo bar = {5, 6, 7};", where the values you write out in the list get applied in order of the element names in the struct definition. You can also use designated initialization for structs, which looks like "struct foo bar = {.element_one = 5, .e_two = 6, .e_three = 7};" That syntax is only available for initialization; you cannot use it for any other assignment. In other words, you can't change elements in bulk, you have to do it one at a time.
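
    The same examples spelled out, with the struct definition included so it's self-contained:

        struct foo { int element_one; int e_two; int e_three; };

        int foo = 5;                   /* declaration + initialization (tags and variables live in separate namespaces, so this is legal) */
        struct foo bar = { 5, 6, 7 };  /* positional: values follow member order */
        struct foo baz = { .element_one = 5, .e_two = 6, .e_three = 7 };  /* designated initializers */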

    C lets you get real wild and combine struct definition, struct instance declaration, and initialization all into one! Though if I were your code reviewer I'd reject that for readability.
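
    For the curious, that all-in-one version looks something like this:

        /* struct definition, instance declaration, and initialization in a single statement */
        struct color { unsigned char r, g, b; } background = { .r = 0, .g = 0, .b = 32 };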

    </wall-o-text>


  • needed to add a mechanic to slow time down

    The devs actually thought of that. There are two auxiliary time-control songs: one slows time down by ~50%, the other jumps ahead to the next dawn/dusk. MM3D revised the latter to allow jumping to any top of the hour within the next 12 hours.

    Any of the scarecrows around town will teach them to you just by talking to them, but they do so by describing the songs, not teaching you the notes.


  • The way I think about Majora’s Mask as a Zelda game is that in addition to exploring the physical world, you’re also exploring time. That does necessitate “backtracking” by forcing time resets and a lot of waiting around if you don’t immediately know what you can be doing in parallel (though the two time control songs make that part easier).

    With the exception of the dungeons themselves, the game typically fast-tracks getting you back to where you were right after a reset. Some of those mechanics the game forces on you pretty quickly (Song of Soaring fast travel); others it lets you figure out on your own (instant warp to a dungeon boss after beating it the first time).

    Side quests can be a bit more troublesome to deal with if you have to reset part way through, but each interaction point that you have to go through offers you another way to handle things (or to not and let another sequence of events happen).

    To your last point, the game really throws refillable items at you in the overworld, so a lot of times you can skip that (I’m not saying stocking up doesn’t take forever on reset, it does. You just don’t always have to)

    All in all I really love the time mechanic, and that lets me forgive some of the game's other flaws. If it fell flat, then yeah, I can see how the game quickly becomes a chore. But I adore it, hence the username.


  • I feel like the Win 10 default apps just waste so much screen real estate. I've been using Thunderbird for years, and while 5 years ago I would have agreed the user interface was obtuse, the refresh that happened a few years back really improved things. I've also never had stability problems, and I have Thunderbird tracking 7 email accounts with hundreds of thousands of emails total (I'm a data hoarder).

    Evolution on the other hand, hoo boy, I have to use it at work and despise it lol. That program gives me stability problems and frequently fails to interact with Exchange. Gives me a great excuse for missing meetings haha

    All said, I think Outlook desktop is superior to both Thunderbird and Evolution; I just don't wanna pay for it.