• 207 Posts
  • 1.17K Comments
Joined 3 years ago
Cake day: June 15th, 2023

  • For clarification, Organic Maps was the project accused of mismanagement. A significant portion of that community departed to create this app, CoMaps. The goal from the outset was to build a more transparent and open community, hence the name: community maps, or CoMaps.

    CoMaps is a fairly recent fork of what was already an excellent app, though one that likely had poor management ethics behind it. I’ve been extremely impressed by the rapid pace of development on a pretty sizable project. There are already hundreds of small (and a few not-so-small) improvements over the project it was forked from.


  • It’s a very different approach. Personally, I could never fit OsmAnd into my daily routine, but this one has been great.

    It’s not a reskin; it’s built from the ground up, although it is technically a fork of a fork (not of OsmAnd).

    For me, it’s much more user-friendly, more intuitive, and even quicker. They both use OpenStreetMap data, but I think they are worlds apart. I haven’t done any testing with OsmAnd for a couple of years, though, so I couldn’t tell you which specific features differ.

    I find it very easy to read, day or night. It’s quick to add a destination for navigation, and it’s very easy to make edits directly in the app that upload to OSM.


  • Just to be clear, companies know that LLMs are categorically bad at giving life advice and emotional guidance. They also know that personal decision-making is the most common use of the software. They could easily put guardrails in place to prevent it from doing that.

    They will never do that.

    This is by design. They want people to develop pseudo-emotional bonds with the software and to trust its judgment in matters of life guidance. In the next year or so, some LLM projects will become profitable for the first time as advertisers flock to the platforms. Injecting ads into conversations with a trusted confidant is the goal. Influencing human behaviour is the goal.

    By 2028, we will be reading about “ChatGPT told teen to drink Pepsi until she went into a sugar coma.”