I’m not a programmer and I don’t understand code. How do I know if an open source application isn’t stealing my data or passwords? The Google Play Store scans apps and claims to block spyware, but we know it hasn’t been very successful at that. So, can we trust open source software? Can’t someone slip their own virus in precisely because the code is open?

  • Gibberish9031@lemmy.ml · +55/-1 · 1 year ago

    Yes, but the idea is that because the code is open source, anyone can look at it and determine for themselves whether it is in fact safe. Generally speaking, the open source community is very good at figuring this kind of thing out, but I would say your fear is not entirely misplaced, since nothing is 100% guaranteed. That said, the more popular FOSS apps are quite safe.

      • squiblet@kbin.social · +8 · 1 year ago

        The way people use npm has long been a problem: the basic model is pulling in four dozen small snippets of code from repos all made by different people and rarely verified. It’s quite different from running one application built by a group of developers who understand all the components and monitor/approve changes.
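
        As a rough illustration of the scale involved, here is a small Python sketch that counts how many separately maintained packages a project actually pulls in. It assumes npm’s standard package-lock.json layout (lockfile version 2 or 3), and the file path is a placeholder:

        ```python
        # Count the third-party packages an npm project installs by reading its
        # package-lock.json; lockfile v2/v3 lists every installed package under
        # the "packages" key. The path below is a placeholder.
        import json

        with open("package-lock.json") as f:
            lock = json.load(f)

        # Keys look like "node_modules/lodash"; the empty key "" is the project itself.
        deps = [name for name in lock.get("packages", {}) if name]
        print(f"{len(deps)} installed packages, each maintained by someone else")
        ```

        Even a modest web project will often report hundreds of entries here, which is exactly the verification problem being described.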

      • DogMuffins@discuss.tchncs.de · +4 · 1 year ago

        True, but these have been identified pretty quickly; they’re not insidiously harvesting data in the background over long periods.

        • Tanoh@lemmy.world · +5 · 1 year ago

          Well, we have detected those that have been detected. It is possible that there are some sleeper repos no one has detected yet.

          But it is not really a problem with, or something bad about, FOSS; you just have to be careful when including and updating libraries, which you always have to be anyway!

      • /home/pineapplelover@lemm.ee · +12 · 1 year ago

        This is why lots of open source projects critical for privacy and security are audited: ProtonVPN, ProtonMail, Mullvad, Signal, Matrix, GrapheneOS, and more. They are audited and are very big projects with many eyes upon them. The more eyes, the more secure it will be.

        • dustojnikhummer@lemmy.world · +8/-1 · 1 year ago

          Yes, those are much more trustworthy than audited closed source projects. I’m just saying that “anyone can check” doesn’t mean “someone will check.”

      • GVasco@discuss.tchncs.de · +6 · 1 year ago

        Well, if the app is actively maintained, the code is checked every time someone makes a pull request against the main code base. You still have to trust the maintainers of the repository (code base) to review every pull request thoroughly; however, it’s in their best interest to do so in order to maintain the trust of the project’s users.

      • DogMuffins@discuss.tchncs.de · +4 · 1 year ago

        Well, not exactly.

        Some open source projects have many contributors, and while they’re working on fixing bugs and adding new features, the chances that no one would notice, say, a keylogger or a crypto miner are very slim.

        Other open source projects are maintained by large, sophisticated organisations that monitor security in some fashion. At the very least, they would watch for obvious things like the software transmitting data.

        That’s not a 100% guarantee of security, but it’s not as reckless as just hoping someone will check.

  • Z4rK@lemmy.world · +25 · edited · 1 year ago

    Check activity before trusting open source

    By default, FOSS is no more secure or privacy protected than proprietary software. However, it allows the community to peer review the code. So, a popular and active FOSS project can be trusted to be honest and not do nefarious things to your data or devices.

    Check activity on their code repository: stars, followers, and forks say something about popularity; issues and pull requests tell you about activity (check the comments, or check recently closed issues and pull requests), as do the code commits themselves.
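
    If you want to automate that kind of check, the public GitHub REST API exposes most of these signals. A minimal sketch (the repository named here is just an example, and unauthenticated requests are rate-limited):

    ```python
    # Fetch a few activity signals for a repository from the public GitHub API.
    import json
    import urllib.request

    def repo_activity(owner: str, repo: str) -> dict:
        url = f"https://api.github.com/repos/{owner}/{repo}"
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        return {
            "stars": data["stargazers_count"],
            "forks": data["forks_count"],
            "open_issues": data["open_issues_count"],  # includes open pull requests
            "last_push": data["pushed_at"],
        }

    # Example: one of the audited projects mentioned elsewhere in this thread.
    print(repo_activity("signalapp", "Signal-Android"))
    ```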

    Edit: Changed wording from secure to trust / honesty. Not all code focuses on security; in fact, most code doesn’t.

  • pjhenry1216@kbin.social · +22 · 1 year ago

    You mention the Google Play issue. That is an example of a disadvantage of closed source (Android is open, Google Play Protect is not). Google Play Protect is essentially static code analysis; think of it almost like antivirus. It tries to look for anomalies in the code itself. But it’s not great, it can be tricked, and we don’t even know how good it is or what kinds of checks it does.
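
    To make “it can be tricked” concrete, here is a deliberately toy Python sketch of the idea behind static scanning. The patterns and strings are invented for illustration; real scanners are far more sophisticated, but naive pattern matching fails against trivial obfuscation in exactly this way:

    ```python
    # Toy "scanner": flag suspicious patterns in source text without running it.
    import re

    SUSPICIOUS_PATTERNS = [
        r"getDeviceId\s*\(",               # reading hardware identifiers
        r"http://[^\s\"']+",               # hard-coded plaintext endpoints
        r"Runtime\.getRuntime\(\)\.exec",  # spawning shell commands
    ]

    def scan(source: str) -> list[str]:
        return [p for p in SUSPICIOUS_PATTERNS if re.search(p, source)]

    obvious = 'upload_url = "http://tracker.example/collect"'
    obfuscated = 'upload_url = "htt" + "p://tracker." + "example/collect"'

    print(scan(obvious))     # the plaintext-endpoint pattern is flagged
    print(scan(obfuscated))  # [] - same behaviour at runtime, nothing flagged
    ```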

    FOSS code has many people looking at it. You can compile it yourself. It’s extremely unlikely for something that’s remotely popular to have explicitly malicious code in it. Is it impossible? No. But just as you get folks deep diving video game code assets, you get people looking at code of many FOSS projects. Likely because they either want to contribute or make changes.

    It comes down to it being easier to find malicious actors in FOSS; it’s just more difficult to hide things than in closed source.

    Why would you think closed source is any safer, when it carries all the same risks but worse? Closed source can just as easily (arguably more easily) steal your info, and many closed-source apps do, burying it in their EULAs.

    • Serinus@lemmy.world · +5/-3 · 1 year ago

      I wouldn’t assume there are many people looking at most open source code. And even if there are, it’s not impossible to hide malicious code.

      Just because people can review it doesn’t mean they are reviewing it.

      It does introduce more risk of discovery though. Malicious code is easier to find, and there will be at least a username associated with it.

      • pjhenry1216@kbin.social · +5 · 1 year ago

        There are more people looking than there are elsewhere. And unless you’re suggesting the authors themselves are malicious (which can happen), most FOSS is reviewed, especially the larger projects. You can tell by the number of contributors. Smaller projects will surely be an issue, but popular ones do get reviewed, simply because many people want to be able to contribute.

        It’s almost certainly more review than proprietary software gets, and all of these risks still apply to proprietary software anyway.

    • zencat@kbin.social · +3/-2 · 1 year ago

      How come users don’t have root access on Android even though Android is open?

      • pjhenry1216@kbin.social · +10 · 1 year ago

        Because of the handset makers and wireless carriers (honestly more the latter than the former). It’s not because of Google or Android.

      • exscape@kbin.social · +8 · 1 year ago

        Most phones use customized versions of Android, and the companies behind them decide you shouldn’t have root access. Root opens up security issues and makes it easier to bypass ads and DRM, which they don’t like.

        You can get it on some phones, including Google’s.

        • zencat@kbin.social · +2/-4 · 1 year ago

          But why is Android even called open source when Google places restrictions on it? Isn’t it a dangerous path when Google can decide to ban F-Droid on the platform? What could stop them from doing that? How is the future of Android even guaranteed under such a greedy company as Google?

          • exscape@kbin.social · +6 · 1 year ago

            The Android source code is available, but unfortunately that doesn’t mean that all phones are based solely on that source code. Almost all vendors (including Google) have closed-source additions to it.

            There are indeed people who agree with you. I do in principle too, but I can’t say this is something I think about much, which is probably how most people who even understand the issue feel. And most people don’t have a clue the issue exists.

            Google could ban F-droid on some phones, but not all. OEMs could overrule Google on such things with their custom Android builds, and even if they didn’t, users could create their own ROMs to solve the issue for rooted devices.

            • zencat@kbin.social · +1 · 1 year ago

              Alright, I think now I understand. Thank you for the answers.

              OEMs, as I understand it, are the companies that make phones. They mostly care about profit, and if some future agreement with Google or any other corporation would make them more money but restrict user control, they couldn’t care less and would go for the money. And day-to-day users won’t care as long as they can use their favorite apps and browse the internet.

              It seems like a wise idea to already think about making Android less and less reliant on a corporation. Especially looking at the recent example of Reddit, a sudden change or decision from companies is not impossible.

        • zencat@kbin.social · +3/-1 · 1 year ago

          Alright, but why does Google get to decide that? Why not make it so that users can get root access the same way they can unlock developer mode? On top of that, doesn’t making it difficult or almost impossible to remove their apps defy the idea of open source? How can Android even be called open source when users have so many restrictions put upon them by Google?

          • Peruvian_Skies@kbin.social · +2 · edited · 1 year ago

            There is the Android Open Source Project (AOSP), and then there’s Google’s Android, which has both open and closed components (e.g. proprietary media codecs). There is such a thing as a pure, open-source Android, but what Google ships is not 100% open.

            Think of it like Google’s browser: AOSP is Chromium, the Android that comes with your phone is Google Chrome.

  • moobythegoldensock@geddit.social · +23/-1 · 1 year ago

    How do you know if a closed source application is stealing your data?

    With open source, you can learn to read it, or talk to a community of people who know how to read it. If even just 1 in 500 people who download the software looks at the source, there are external eyes on it. Whereas with closed source, no one but the creator is looking.

    The biggest thing is still to only install software you trust.

  • CthulhuDreamer@lemmy.world · +17 · 1 year ago

    One more note about safety when it comes to open source or FOSS: you should use only the main repository and the distributions provided by the official team. Often people clone an existing repo, insert malicious code, and publish it as their own app on the Play Store, etc.

  • Whirlybird@aussie.zone · +16/-3 · 1 year ago

    No, open source code is no safer than closed source code by default. What it does is give people the opportunity to verify that it’s safe, but that doesn’t mean it is safe. Also, just because some people have “verified” that it is safe doesn’t mean they didn’t simply miss the vulnerabilities or nasty code.

    • 1847953620@lemmy.world · +9 · 1 year ago

      Software companies are not known for their accountability for hacky code, though. FOSS leads to better quality because it resolves that accountability conflict of interest in an efficient way.

    • GlowHuddy@lemmy.world · +3 · 1 year ago

      Agreed. I’d say with open source it is harder to “get away” with malicious features, since the code is out in the open. If authors were to include such features, the open nature of their code also serves as a bit of a deterrent, since there is a much bigger chance of people finding out compared to closed source. However, as you said, it is not impossible, especially since not many people look through the code of everything they run. And even then, it is not impossible to obfuscate malicious code well enough that it isn’t spotted on a casual read-through.

    • RightHandOfIkaros@lemmy.world · +3 · 1 year ago

      Accounts that post code “verifications” can also be sock puppet accounts, so it is always good to double-check for yourself if you know the programming language, or to check the account’s history to see whether it has verified other software from different, unconnected authors. Nothing is sketchier than a verification ring, where accounts all vouch for each other.

      • pjhenry1216@kbin.social · +2 · 1 year ago

        This is only an issue if it’s only been reviewed by one or two coders with zero history on the repo’s host. This is rare for anything that is remotely popular.

  • onescomplement@lemm.ee · +12 · 1 year ago

    In terms of telemetry, free software has the advantage over its proprietary counterparts.

    It’s a lot harder to hide telemetry from the user in free software.

    You could always use a network tool like iftop to see network traffic on your PC. That could be a way to see if a program is phoning home, though you’ll probably want to use a suite of tools.
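
    If you’d rather script it than watch iftop, the third-party psutil package can give a similar per-process view. A rough sketch, assuming psutil is installed and you have enough privileges to see other processes’ connections:

    ```python
    # List which local processes currently hold established outbound connections.
    import psutil  # pip install psutil

    for conn in psutil.net_connections(kind="inet"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            try:
                name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
            except psutil.NoSuchProcess:
                continue  # the process exited while we were looking
            print(f"{name:<20} -> {conn.raddr.ip}:{conn.raddr.port}")
    ```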

    • spizzat2@lemm.ee · +5/-1 · edited · 1 year ago

      free software

      To make a common clarification: free as in “free speech”, not (necessarily) free as in “free beer”.

      Just because the software costs nothing, doesn’t mean that it’s not hiding something. In fact, the opposite is often true.

      I’m sure you know that. I’m just clarifying for OP, who isn’t a programmer.

  • Tl;Dr: you shouldn’t trust anyone or anything blindly or unconditionally. However, open source software and its community offer compelling reasons to trust it over proprietary software.

    Technically, if you do not read all of the source code of an application and all its dependencies, you can never be 100% sure that it isn’t doing nefarious things. For things that require a connection to the internet, you could monitor all connections to and from the application and its dependencies and see if it is making objectionable connections.

    However, in my view, open-source software is in general safer than closed-source software. Open-source software can be audited by anyone who knows the languages the program is written in, whereas closed-source software can only be audited by the developer or the few parties they might authorize to see it. Closed-source apps can easily hide spyware because the source code is completely unavailable. Spyware could still be missed by the community, but it’s a whole hell of a lot less likely with so many eyes on the program.

    And practically, whenever an open-source project gets even close to including nefarious stuff, the community generates a huge hoopla about it.

    Also, Google Play Store is not open source! A better example would be F-Droid, which is an app store that is open-source. While I am not aware of F-Droid delivering spyware ala Google, it is still theoretically possible that they could screw up or be corrupted in the distant future. Therefore, we must stay vigilant, even with groups and people we trust. Practically, this just means… check their work once in a while. It wouldn’t kill you to learn a programming language; try Python for quick results. What I do is whenever an open-source software is written in a language I understand, I’ll pick a few files that look the most important and skim them to see that the program “does what it says on the tin”. Otherwise, I’ll check through the issues on GitHub for any weirdness.

    I haven’t even mentioned free and open-source software (free as in speech). I genuinely do not know how to convince people who are disinterested in their own freedom to consider FOSS options, or to do very nearly anything at all. For everyone else: FOSS software respects your freedom to compute as you please. We can quibble about different licenses and whether and how effectively they safeguard user freedom, but at the end of the day, FOSS licenses are at least intended to give users back their freedom. In my view, it is mightily refreshing to finally take some freedom back!

  • Puzzle_Sluts_4Ever@lemmy.world · +8/-2 · 1 year ago

    Unless you know how to inspect it yourself (or trust someone who can): No.

    Yes, theoretically, if someone were to insert malware hooks into Blender, the entire internet would be freaking out. Except… we live in a post-truth society, and that could very rapidly be astroturfed to the point of “nobody really knows, but it is probably fine.” A good example in the opposite direction was that Chinese (probably?) battle royale game a few years back (Rings of Something?) where, at the height of the BR wars, “somebody” claimed that it involved malware. To my knowledge it didn’t, but the claim more or less killed the game in the eyes of most people.

    That said: Like with anything, what matters is the downstream users. If someone somehow introduced malware to glibc, the entire world would erupt in a manhunt because very significant percentages of the world run on that. Whereas, some closed source proprietary tool with a thousand customers might never notice.

    FOSS is more about ideology and what you want the future of computing/the economy to be. Any discussions of “safety” are in the same realm of “security through obscurity” where… yes, it can help but if you are relying on that you are already dead.

  • Kissaki@feddit.de · +5 · 1 year ago

    You shouldn’t see trustworthiness, or trust, as a binary of all or nothing.

    You should assess it, as far as you and the product make possible, and then weigh risk, necessity, and value.

    Source exposure makes it more likely that people will look at the code, whether without any particular cause or when something seems surprising or questionable. Source availability alone doesn’t mean you’d hear about concerns, though: those would need a visible platform or publicity.

    FOSS may be funded and implemented by voluntary work or paid or sponsored, with or without control by the involved parties.

    Security scanning is a best-effort exercise, weighing known signatures, similarity, and suspect parts against false positives and inconvenience or hindrance for users and publishers. It can’t be perfect.

    Android Play Store security scanning can only scan for some things I’d consider security relevant and likely largely ignores questionable behavior that does not endanger device security.

    Established projects are more trustworthy than those that are not. Personal projects with a clear goal are more trustworthy, because of the likelihood of good intention and personal interest, than those that seem obscure or unclear.

    Don’t trust blindly.

    Safety is a big topic and theme. So such a broad question can only be answered with broad assessment and overview.

  • Maharashtra@lemmy.world · +5 · 1 year ago

    Is FOSS really safe?

    It’s not an attempt at edginess, but the answer is that in the long run, NOTHING in IT is safe. It might be now, it might be for some time, but there’s no guarantee that even the most dependable piece of software won’t get some new update that breaks some of its functionality, or that the OS won’t interfere with it and break it.

    FOSS? It’s safe as a principle. If anyone has access to the code, then any suspicious addition to it will be spotted quickly and patched out.

    …but the reality isn’t as straightforward.

  • rufus@discuss.tchncs.de · +6/-1 · edited · 1 year ago

    Tl;dr: Don’t download random APKs from the internet, just because they claim to be FOSS. Just get them from F-Droid and you’re safe.

    Long answer: it depends on the project. Look at how many people use it. If it’s a bunch, chances are other people also keep an eye on it. Even better if you get that software packaged, meaning from the package manager of your Linux distribution or, in your case on Android, from F-Droid. This way somebody from that team has had a look at it, and F-Droid even strips trackers from apps. I’d say the chances of a virus or spyware getting through the F-Droid process are close to none, and certainly no greater than the chances of a virus slipping past Google’s antivirus.

    (Play Store doesn’t do anything against excessive tracking.)

      • Peruvian_Skies@kbin.social · +2 · 1 year ago

        Part of it is automated, part of it is real people looking at the source code. That’s done by sampling of course, since it’s not feasible to have someone manually look over every new update to every app.

        • rufus@discuss.tchncs.de · +2 · edited · 1 year ago

          Yeah. I haven’t looked it up, but a huge part seems to be manual labor. They have a good look at an app when it gets included in the F-Droid repository. The app then gets re-packaged to meet their standards and compiled from source. During this process, tracking libraries and other (proprietary) components get stripped out.

          They have an automated build server. I’m not sure whether it does any additional tests or just checks that it can build the app, but it also prepares the updates.

          I doubt there are automated antivirus scans involved; usually only Windows users do that.

          And you have a community of many other users who use the same build of an app. They’ll file bug reports and will probably notice if an app stops working or starts consuming huge amounts of data and battery. Those users also tend to be more tech-savvy than Play Store users.

      • copygirl@lemmy.blahaj.zone · +2 · 1 year ago

        From what I know, F-Droid compiles apps from source, so you can be sure that the code you’re running is actually built from the source code it claims to be built from. On most other platforms, the developers could be uploading malicious programs whose code differs from what’s shared online as their source. Then add the fact that other developers can and do look at the code, and at what changes are made from version to version.
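
        The underlying check is simple once you have both artifacts: build the app from source yourself and compare the result byte for byte with what is being distributed. A minimal sketch of that comparison (the file names are placeholders, and in practice APK signatures have to be handled separately, which is what reproducible-build tooling takes care of):

        ```python
        # Compare a locally built artifact against the one being distributed.
        import hashlib

        def sha256(path: str) -> str:
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 16), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        built = sha256("app-built-from-source.apk")  # what you compiled yourself
        published = sha256("app-downloaded.apk")     # what the store serves
        print("identical" if built == published else "MISMATCH, do not trust it blindly")
        ```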