a2128 20 hours ago
    Why we collect telemetry

    ...our team needs visibility into how features are being used in practice. We use this data to prioritize our work and evaluate whether features are meeting real user needs.

I'm curious why corporate development teams always feel the need to spy on their users? Is it not sufficient to employ good engineering and design practices? Git has served us well for 20+ years without detailed analytics over who exactly is using which features and commands. Would Git have been significantly better if it had collected telemetry, or would the data not have just been a distraction?

  • embedding-shape 20 hours ago

    > Is it not sufficient to employ good engineering and design practices?

    It's not that it's insufficient; it's that new developers, product people, and designers literally don't know how to make tasteful and useful decisions without first "asking users" by experimenting on them.

    Used to be you built up an intuition for your user base, but considering everyone is changing jobs every year, I guess people don't have time for that anymore, so literally every decision is "data driven" and no user is super happy or unhappy anymore; everyone is just "OK, that's fine".

  • j_maffe 20 hours ago

    > Would Git have been significantly better if it had collected telemetry, or would the data not have just been a distraction?

    I'm not sure if you're implying it's obvious but it's not obvious to me that it would be unhelpful.

    • a2128 20 hours ago

      Just anecdotally, I get the feeling telemetry often does more harm than good, because it's too easy to misinterpret or lie with statistics. There needs to be proper statistical methodology and biases need to be considered, but this doesn't always happen. Maybe a contrived example, but someone wants to show high impact on their next performance review? Implement the new feature in such a way that everyone easily misclicks it, then show the extremely high engagement as demonstration that their work is a huge success. For Git, I'm not sure it would be widely adopted today if the development process was mainly telemetry-driven rather than Torvalds developing it based solely on his expertise and intuition.

      • wongarsu 19 hours ago

        Not to mention it's really hard to statistically tell the difference between people spending a lot of time with a feature because it's really useful and because it's really difficult to get it to do what you want

        Telemetry is a really poor substitute for actually observing a couple of your users. But it's cheap and feels scientific and inclusive/fair (after all you are looking at everyone)

        • Sytten 19 hours ago

          That is just poor analytics IMO. If you have a good harness you can definitely tell whether a feature is well designed. You have to optimize for things like the number of clicks to perform an operation, not time spent in the app.

    • sammorrowdrums 19 hours ago

      I think seeing the underutilized commands and flags (with real data, not just a hunch) would have helped identify where users didn't understand why they should use them, and could have helped refine the interface and docs to make them gradually more usable.

      I mean no solution is perfect, and some underused things are just only sometimes extremely useful, but data used smartly is not a waste of time.

  • Forgeties79 20 hours ago

    This is where (surprise surprise) I respect Valve. The hardware survey is opt-in and transparent. They get useful info out of it and it's just... not scummy.

    There are all sorts of best practices for getting info without vacuuming up everyone’s data in opaque ways.

    • embedding-shape 20 hours ago

      To be fair, you can be pretty sure they're heavily leveraging all their store data, in loads of ways. They probably sit on the biggest dataset of video game preferences for people in general, and I'm betting they make use of it heavily.

      • Forgeties79 19 hours ago

        If you have details on what they’re collecting and how they’re using it/if they’re selling it to advertisers/etc, I’m happy to make a judgment.

        I’m not saying they don’t engage in any of those practices, I am specifically talking about the hardware survey.

        • embedding-shape 19 hours ago

          > If you have details on what they’re collecting

          Well, you can start with everything a typical HTTP request and TCP connection comes with, surely they're already storing those things for "anti-fraud practices", wouldn't be far to imagine this data warehouse is used for analytics and product decisions as well.

          • Forgeties79 18 hours ago

            >wouldn’t be far to imagine

            I explicitly said I agree it’s a distinct possibility, but that’s not proof. If you have actual info on what they collect and how it’s used I can assess it. As it is we don’t know the extent or uses at all, we are speculating.

            • embedding-shape 18 hours ago

              Personally if I don't have any evidence of something, I'll leave it unsaid if I like what Valve is doing about that something or not. Saying "We don't have evidence either way, we're just speculating, so therefore I respect what Valve does" feels like the wrong way around. But you do you :)

              • Forgeties79 15 hours ago

                >But you do you :)

                This was unnecessary and patronizing. Clearly this won't be productive: it's clear what I am trying to say, but you seem intent on taking a narrow interpretation of my point to paint me as someone who blindly trusts Valve (I don't). Have a good one man.

      • LtWorf 19 hours ago

        And you think microsoft isn't already doing that?

        • embedding-shape 19 hours ago

          Of course they do, why would I believe otherwise?

    • salomonk_mur 20 hours ago

      They are analyzing absolutely every click you make, I can guarantee it.

      • Forgeties79 20 hours ago

        And if you provide evidence of this (and yes I think it is possible) then I will say it’s bad.

        The hardware survey is not that.

  • gardaani 20 hours ago

    It isn't only corporate development teams — open source development teams want to spy on their users, too. For instance, Homebrew: "Anonymous analytics allow us to prioritise fixes and features based on how, where and when people use Homebrew." [1]

    [1] https://docs.brew.sh/Analytics
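    Per the linked docs, those analytics are opt-out rather than opt-in; disabling them is a one-liner. (The `command -v brew` guard is only there so this sketch is safe to run on machines without Homebrew installed.)

```shell
# Homebrew analytics are on by default; either of these turns them off.
if command -v brew >/dev/null 2>&1; then
  brew analytics off            # persistent opt-out, stored by brew itself
fi
export HOMEBREW_NO_ANALYTICS=1  # per-environment opt-out via env var
```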

    • leftnode 19 hours ago

      Is it spying if:

      1. It's anonymous

      2. They're telling you they're doing it

      3. You can opt out of it

      • brontosaurusrex 18 hours ago

        "I'm watching you" is a neat way to communicate with people?

  • giancarlostoro 20 hours ago

    > I'm curious why corporate development teams always feel the need to spy on their users?

    I've repeatedly talked about this on HN; I call it Marketing Driven Development. It's when some Marketing manager goes to your IT manager and starts asking for things that no customer wants or needs, so they can track if their initiatives justify their job, aka are they bringing in more people to x feature?

    Honestly, with something as sensitive as software developer tools, I think any sort of telemetry should ALWAYS be off by default.

  • figmert 20 hours ago

    While I agree, I personally always opt out if I'm aware, and hate it when a tool suddenly gets telemetry, I don't think Git is comparable, same with Linux.

    Linux and Git are fully open source, and have big companies contribute to it. If a company like Google, Microsoft etc need a feature, they can usually afford to hire someone and develop _and_ maintain this feature.

    Something like gh is the opposite. It's maintained by a single organisation, and the team maintaining it has finite resources. I don't think it's much to ask to want to understand what features are being used, what errors might come up, etc.

    • LtWorf 19 hours ago

      Good news! gh is actually a client of a web API so they can just read their logs to know what's being used!

  • hansmayer 20 hours ago

    > I'm curious why corporate development teams always feel the need to spy on their users

    Unfortunately this is due to a large part of "decision makers" being non-technical folks who can't understand how the tool is actually used, as they don't use such tools themselves. So some product manager "responsible" for development tooling needs this sort of stuff to be able to perform in their job, just as some clueless product manager in e-commerce absolutely has to overload your frontend with scripts tracking your behaviour, also to be able to perform in their job. Of course the question remains why those jobs exist in the first place, as engineers were perfectly capable of designing interaction with their users before the VCs imposed the unfortunate paradigm of a deeply non-technical person somehow leading the design and development of highly technical products... So here we are, sharing our data with them, because how else will Joe collect his PM paycheck, in between prompting the AI for his slides and various "very important" meetings...

    • ryanmcbride 19 hours ago

      Man, if I had a nickel for every time a PM asked me to violate user privacy for the purposes of making a slide that will be shown to their boss for 2.5 seconds, I'd probably make enough to actually retire someday.

      • phillipcarter 15 hours ago

        There's obviously misuse and abuse in the world, but telemetry from production systems out in the wild is incredibly useful for all kinds of decision-making. It's silly to dismiss it outright.

        • hansmayer 3 hours ago

          Some of it, yes. But not the kind that's put in place so the clueless Joe who used to do copy-writing 3 years ago can keep their high salary because they decided to join the proverbial "IT".

  • charcircuit 20 hours ago

    Git has notoriously had performance and scaling issues, and has had a horrible user interface. Both of these problems can be measured using telemetry, and improvements can be measured once telemetry is in place.

    • LtWorf 19 hours ago

      How was it notorious if git has no telemetry? According to you without telemetry nothing can be known, and nothing can become notorious.

      • charcircuit 18 hours ago

        Because you already have a general idea by just using it. "Oh wow this is slow." Telemetry gives you hard data.

  • gordonhart 20 hours ago

    The impact of a few more network calls and decreased privacy is basically never felt by users beyond this abstract "they're spying on me" realization. The impact of this telemetry for a product development team is material.

    Not saying that telemetry is more valuable than privacy, just that it's a straightforward decision for a company to make when real benefits are only counterbalanced by abstract privacy concerns. This is why it's so universally applied across apps and tools developed commercially.

    • TheDong 19 hours ago

      For most CLIs, I definitely feel extra network calls because they translate to real latency for commands that _should_ be quick.

      If I run "gh alias set foo bar", and that takes even a marginally perceptible amount of time, I'll feel like the tool I'm using is poorly built since a local alias obviously doesn't need network calls.

      I do see that `gh` is spawning a child to do sending in the background (https://github.com/cli/cli/blob/3ad29588b8bf9f2390be652f46ee...), which also is something I'd be annoyed at since having background processes lingering in a shell's session is bad manners for a command that doesn't have a very good reason to do so.
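      For anyone unfamiliar, the pattern the linked code uses is fire-and-forget: detach a child so the user's command returns immediately while the send finishes in the background. A minimal shell sketch of that idea (not gh's actual Go implementation; the `telemetry.log` file and event name here are made up for illustration, a real client would make a network call instead):

```shell
# Fire-and-forget sketch: the parent function returns right away, while a
# backgrounded subshell finishes the "send" (a file append standing in for
# a network request) on its own time.
log_event() {
  ( echo "event=$1 ts=$(date +%s)" >> telemetry.log ) &
}

log_event "alias_set"   # returns immediately, no perceptible latency
wait                    # the lingering child here is the "bad manners" part
```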

      • SchemaLoad 10 hours ago

        If it's done in a background process then it won't impact the speed of the tool at all. When the choice is between getting data to help improve the tool at the cost of "bad manners" whatever that means, the choice is pretty easy.

  • dale_glass 20 hours ago

    > I'm curious why corporate development teams always feel the need to spy on their users? Is it not sufficient to employ good engineering and design practices?

    No, because users have different needs and thoughts from the developers. And because sometimes it's hard to get good feedback from people. Maybe everyone loves the concept of feature X, but then never uses it in practice for some reason. Or a given feature has a vocal fan base that won't actually translate to sales/real usage.

    > Would Git have been significantly better if it had collected telemetry, or would the data not have just been a distraction?

    I think yes, because git famously has a terrible UI, and any amount of telemetry would quickly tell you people fumble around a lot at first.

    I imagine that in an alternate world, a git with telemetry would have come out with a less confusing UI, because somebody would have looked at the stats and, for instance, added "git restore" right from the very start, because "git checkout -- foo.txt" is an absolutely unintuitive command.
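    For reference, the two spellings being contrasted do the same thing in this case; `git restore` (added in Git 2.23) is just the clearer name for it. A throwaway-repo demo:

```shell
# Set up a throwaway repo with one committed file:
git init -q demo && cd demo
echo original > foo.txt && git add foo.txt
git -c user.email=a@example.com -c user.name=a commit -qm init

# Dirty the file, then discard the change, old spelling first:
echo scribbled > foo.txt
git checkout -- foo.txt   # pre-2.23 idiom: "check out the file again"
echo scribbled > foo.txt
git restore foo.txt       # Git 2.23+: same effect, clearer intent
cat foo.txt               # back to "original" either way
```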

    • wongarsu 19 hours ago

      A more intuitive git UI would reduce engagement. Do you really want to cut a 30 minute git session down to five minutes by introducing things like 'git restore' or 'git undo'? /s

    • throwaway27448 19 hours ago

      > because git famously has a terrible UI

      Thankfully, github has zero control over git. If they did have control they would have sunk the whole operation in year one

      > because somebody would have looked at the stats and for instance have added "git restore" right from the very start, because "git checkout -- foo.txt" is an absolutely unintuitive command.

      How is git restore any better? Restoring what from when? At least git checkout is clear in what it does.

      • dale_glass 19 hours ago

        > How is git restore any better? Restoring what from when? At least git checkout is clear in what it does.

        And this is exactly where disconnects happen, and where you need telemetry or something like it to tell you how your users actually use the system, rather than imagining how they should.

        A technical user deep into the guts of Git thinks "you need to check out again this specific file".

        A novice thinks "I want to restore this file to the state it had before I touched it".

        Now we can argue about whether "restore" is the ideal word here, but all the same, end users tend to think in terms of "I want to undo what I did", not in terms of git internals.

        So a hypothetical git with telemetry would probably show people repeatedly trying "git restore", "git undo", "git revert", etc, trying to find an undo command.

        • bfivyvysj 19 hours ago

          > A technical user deep into the guts of Git thinks "you need to check out again this specific file".

          This is a fundamental misunderstanding of both the user base, which is by design all technical, and the use case that this tool serves: deeply technical and specific actions on a codebase.

          Git is not just software. It is also a vernacular about how to reason about code change. You can't just make arbitrary commands do magic stuff by default and then expect that vernacular to be as strong as it is today.

          Those "ergonomics" you're asking for already existed in other tools like CVS and subversion, they are specifically why those tools sucked and why git was created.

          • dale_glass 19 hours ago

            Nonsense. The "git restore" command is now an official part of git, and nothing is being lost because it's technically a git-checkout underneath. It's just a thin UI on top for convenience, nothing is being sacrificed. The old commands still work just like before.

            CVS and Subversion have nothing to do with this, they were extremely different to Git in the way they worked and lost for many reasons that have nothing to do with having command names understandable to normal people.

        • throwaway27448 18 hours ago

          I don't think this is worth the effort. A user either tries to understand the data structures underlying the tool or they don't. We don't market cars to babies, right? We don't pretend the car floats around—it's inherently based on engines and wheels, and the user must understand this to operate it safely. Similarly, git is inherently based around objects and graphs, and its operations should reflect this. "Restore" has simply no meaning in this world. Restore what to when in a world where time doesn't exist?

          Surely, telemetry should help educate the tool maker to reveal the underlying model rather than coercing the model to reflect a bastardized worldview (as restore seems to).

          Trying to wedge git into workflows that don't operate around git seems like a fool's errand. Either we must build tools around the workflow, or we must build the workflow around the tool.

          This is part of why I find Jujutsu so unintuitive: there is no clear model it's built around, only some sense of how to work with files that I apparently lack. But perhaps it is the perfect tool for some workflow I have not yet grasped!

        • throwatdem12311 18 hours ago

          It's a total waste of time because both are going to be maintained in perpetuity, increasing the maintenance burden and attack surface of git.

          “a novice thinks”

          Just learn your damn tools and stop whining.

          • chrishill89 17 hours ago

            The git-restore(1) implementation looks like about 35 lines of code. Then add a little more complexity for some common functions that apparently needed to be factored out.

            For a dedicated "restore" it's worth it to me... (who will not maintain it)

            • throwaway27448 17 hours ago

              At the hidden cost of educating millions of users how git actually operates once they can't restore a file

              • chrishill89 16 hours ago

                Neither of these two commands is any closer to how git "really operates" than the other.

                • throwaway27448 16 hours ago

                  How do you figure? Are you discarding the semantics of how people invoke git? If so why advocate for "restore" to begin with?

                  • chrishill89 15 hours ago

                    I don’t know what the semantics of invoking Git means?

                    These two commands operate on the same level of abstraction. And they should be equally powerful, which means that whichever you choose to learn will be able to serve all of your restore-content needs. That's what I mean.

                    Of course there is always the pedagogic cost of having two similar commands.

                    • throwaway27448 15 hours ago

                      Well surely people who use git and who also know english will think of restoring something. You want to restore what from when? Git offers no concept of time and in fact believing that it does will hamstring your efforts to use it. That's the cost. Why cater to this concept of before and after when this undermines usage?

    • dijksterhuis 19 hours ago

      > I think yes, because git famously has a terrible UI, and any amount of telemetry would quickly tell you people fumble around a lot at first.

      1. git doesn’t have a UI, it’s a program run in a terminal environment. the terminal is the interface for the user.

      2. git has a specific design that was intended to solve a specific problem in a specific way. mostly for linux kernel development. so, the UX might seem terrible to you — but remember that it wasn’t built for you, nor was it designed for people in their first ever coding boot camp. that was never git’s purpose.

      3. the fact that every other tool was designed so poorly that everyone (eventually, mostly) jumped on git as a new standard is an expression of the importance of designing systems well.

      • ForHackernews 19 hours ago

        UI means "user interface". For a CLI tool the UI is the commands and modifiers it offers on the terminal.

        • dijksterhuis 19 hours ago

          i lump those into user experience (UX) stuff as it’s more leaning towards “flow of user action” etc.

          • ulbu 18 hours ago

            ok, you do; that doesn't mean the distinctions others draw are necessarily wrong, as the tone of your first comment suggested.

            • dijksterhuis 18 hours ago

              i believe that those people are wrong :shrug:

              doesn’t change the bigger, more important fact that the struggles people have with git stem from the system design. i.e. the thing that ultimately determines what commands people need to run in what order (see points 2 and 3).

          • Rapzid 17 hours ago

            Yeah, UI impacts UX.

            Also git has a UI.

      • Centigonal 19 hours ago

        "UI" is a category that contains GUI as well as other UIs like TUIs and CLIs. "UX" encompasses a lot of design work that can be distilled into the UI, or into app design, or into documentation, or somewhere else.

        • dijksterhuis 19 hours ago

          > “UX" encompasses a lot of design work that can be distilled into the UI

          like how git needs you to “commit” changes as if you’re committing a change to a row in a database table? thats a design/experience issue to me, not an “it has commands” issue.

      • incrudible 19 hours ago

        Mercurial was better than Git on almost any metric; it eludes me why it lost out. Perhaps because it lacked the kernel-hacker aura, but also because it didn't have a popular repository website with a cute mascot going for it. Either way, tech history is full of examples of better designs not winning minds, due to cost, market timing, etc. And now with LLMs being trained on whatever was popular three years ago, we may be stuck with it forever.

        • JoshTriplett 18 hours ago

          > Mercurial was better than Git on almost any metric, it eludes me why it lost out to Git

          I used Mercurial, professionally, back when there were a half-dozen serious VCS contenders, to contribute to projects that used it. I disliked it and found it unintuitive. I liked Git much better. Tastes vary.

          Git made me feel like I was in control. Mercurial didn't.

          Mercurial's handling of branches was focused on "one branch, one working directory", and made it hard to manage many branches within one working directory.

          Mercurial's handling of branches also made it really painful to just have an arbitrary number of local experimental branches that I didn't immediately want to merge upstream. "Welcome to Mercurial, how can I merge your heads right now, you wanted to do that right now, right?"

          Git's model of "a branch is just a pointer to a commit, commits internally have no idea what branch they're on" felt intuitive and comfortable.

    • trinsic2 18 hours ago

      I think the big problem with telemetry is that it's too much of a black box. There is zero transparency on how that data is really used, and we have a long history of large corporations using this data to build prediction products that track people by fingerprinting behavior through these signals. There is too much at stake right now around this topic for people to trust any implementation.

    • chrishill89 17 hours ago

      Didn't Go propose opt-out telemetry but then the community said no?

      Compilers and whatnot seem to suffer from the same problem that programs like git(1) do. Once you've put it out there in the world you have no idea if someone will still use some corner of it thirty years from now.

  • reaperducer 19 hours ago

    I'm curious why corporate development teams always feel the need to spy on their users?

    Because they're too shy, lazy, or socially awkward to actually ask their users questions.

    They cover up this anxiety and laziness by saying that it costs too much, or it doesn't "scale." Both of these are false.

    My company requires me to actually speak to the people who use the web sites I build; usually about every ten to twelve months. The company pays for my time, travel, and other expenses.

    The company does this because it cares about the product. It has to, because it is beholden to the customers for its financial position, not to anonymous stock market trading bots a continent away.

    • Sytten 19 hours ago

      Respectfully I think your argument defeats itself. If you can only speak to your users once every 10-12 months it means your process doesn't scale by definition. Good analytics (not useless vanity metrics) should allow you to spot a problem days after it was launched not wait 3 quarters for a user to air their grievances.

      • Citizen_Lame 18 hours ago

        Ah yes, all the spyware on Windows 11 really helped Microsoft scale up development and make it the best Windows version ever.

        Now, let's replicate this with GitHub. What can go wrong?

      • reaperducer 18 hours ago

        You're describing a different problem.

        Bug fixing absolutely gets taken care of immediately, and our customers are very active in telling us about them through these strange new feedback mechanisms known as "e-mail" and "a telephone."

        But we don't spy on people to fix bugs.

        Nothing that big tech "telemetry" is doing is about bug fixes. In the article we're all discussing, the spying that Microsoft proposes isn't to fix bugs. Re-read what it wrote: it's all for things that may not appear for weeks, months, or years.

        And to think that a trillion-dollar company like Microsoft can't figure out how, or doesn't have the money available to scale real customer feedback is just sticking your head in the sand and making excuses.

        Microsoft doesn't need people to apologize for its failure.

      • goosejuice 16 hours ago

        Microsoft has a horde of developers that covers the entire breadth of gh usage. They could fix issues prior to a release if they wished to, without opt-out client-side telemetry.

  • Sytten 19 hours ago

    I used to believe it was not necessary until I started building my own startup. If you don't have analytics you are flying blind. You don't know what your users actually care about or how to optimize a successful user journey. The difference between what people tell you when asked directly and how they actually use your software is shocking.

    • throwaway27448 19 hours ago

      I'm pretty OK with the github cli tool team flying blind. The tool isn't exactly a necessary part of any workflow. You don't need telemetry to glean that.

      • merlindru 19 hours ago

        that's akin to saying "i do not need their product therefore i don't care"... so what's your point? someone may have made it part of their workflow!

        • throwaway27448 16 hours ago

          True. Some people shouldn't use git if their workflow doesn't beg it.

    • embedding-shape 19 hours ago

      You're only flying blind if you make decisions without looking and thinking. Analytics isn't the only way to figure out "what your users actually care about"; you can also try the old-school way, commonly referred to as "talking with people". After taking notes, you think about it, maybe discuss with others. Don't take what people say at face value, but weigh it together with your knowledge and experience, and you'll make even better product decisions than the people who are only making "data driven decisions" all the time.

      • Sytten 19 hours ago

        We do both and they yield different learnings. They are complementary. We also have an issue tracking board with upvotes. I would say to your point that you can't improve what you don't measure.

        • bfivyvysj 19 hours ago

          I would say to your point that you can't not spy on me while also spying on me. Maybe just don't?

          • lukevp 19 hours ago

            If I was running a physical business and I wrote down each person’s name and credit card number and the exact time and order they placed, that would be pretty invasive and “spying”. If I write down how many units I sold of each item per day, and the volume of transactions by credit card vs cash, it’s anonymized and I don’t think this would generally be considered “spying”, just normal business metrics. How’s the latter much different than anonymized product analytics?

            • lamasery 19 hours ago

              Watching me use my computer in my house or office is spying.

              Aggregating request statistics server-side (unless you're only generating those requests to spy on what I'm doing on my computer) is more like the not-spying you're talking about.

              • vlovich123 18 hours ago

                The logical conclusion is you’re asking for no local products and everything to run server side. It’s kind of a ridiculous position that doesn’t change the spying being done other than it’s on the other side of a browser.

                • lamasery 18 hours ago

                  I accounted for this in my post. Obviously if you’re making requests just so you can spy, that’s spying.

                  • vlovich123 6 hours ago

                    No you didn’t. If I build you a web video editor, is that because I want to spy on you or because I want to make deployment easier and reduce install friction?

                    You're making a distinction that puts you in the privileged position of judging whether a service is making requests just to spy, versus what the app authors might believe is a critical design feature in how they want the product to operate.

              • embedding-shape 18 hours ago

                > Watching me use my computer in my house or office is spying.

                I agree, but once you cross the border out to the internet, I'd say you need to stop seeing it as "me sitting at my computer at home", because at that point you're essentially "on someone else's property". And I say this as someone who cares greatly about preserving personal privacy.

                • lamasery 18 hours ago

                  I deeply hate that this attitude took over even among “hackers”.

                  Watching people move their mouse and click stuff on “your webpage” is fucking spying. It’s in my browser. On my machine. Not running on your hardware.

                  Tracking what I do on my own computer doesn’t stop being spying because the program I’m doing stuff in can make network requests. WTF.

                  • embedding-shape 17 hours ago

                    > Watching people move their mouse and click stuff on “your webpage” is fucking spying. It’s in my browser. On my machine. Not running on your hardware.

                    Well, I was mainly talking about network requests, which are quite literally served by "my hardware" when your client reaches out to my servers, and they agree to serve your client. I do agree that it sucks that browser viewports now also are considered "mine" from the perspective of servers, but you do have a choice to execute that code or not, you can always say no.

                    I don't think it's so much that "this attitude took over"; people saying that the internet is the wild west and warning you to "browse at your own peril" have been around for as long as I can remember.

                    • lamasery 17 hours ago

                      Yeah server logs don’t bother me. I’m requesting a resource, you unavoidably see that happen.

                      The attitude that’s changed is that in the 90s and 00s a program that sent information about what you’re doing that wasn’t necessary and expected for how it operates would have been instantly, popularly, and unequivocally labeled spyware by a programmer crowd. Now it’s normal and you get a bunch of folks claiming it’s ok.

              • mgfist 17 hours ago

                Most telemetry is more along the lines of "user spent N minutes on platform, clicked on these things, looked at these other things" etc etc. And the primary way devs use this data is by aggregating across all users and running a/b tests or viewing longer term trends.

                Are some companies spying on you the way you say? Yea, probably. Most of us just want data to know what's working and what's not.
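
                As a hypothetical sketch of the kind of aggregation described above (the event names and fields are made up): individual events get collapsed into anonymous per-feature counts before anyone analyzes them, so what analysts see is totals and trends, not a single user's session.

```python
# Hypothetical sketch: reducing raw telemetry events to anonymous counts.
from collections import Counter

# Raw events as a client library might batch them; the user field exists
# only for de-duplication and is dropped at aggregation time.
events = [
    {"user": "u1", "action": "clicked_tab3_secondary"},
    {"user": "u2", "action": "clicked_tab3_secondary"},
    {"user": "u1", "action": "opened_settings"},
]

def aggregate(events):
    """Collapse raw events into per-feature usage counts (no user ids)."""
    return Counter(e["action"] for e in events)

print(aggregate(events))
# Counter({'clicked_tab3_secondary': 2, 'opened_settings': 1})
```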

      • johnfn 19 hours ago

        Sure, you can spend the weeks to months of expensive and time consuming work it takes to get a fuzzy, half accurate and biased picture of what your users' workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision.

        Sometimes HN drives me crazy. From this thread you’d think telemetry is screen recording your every move and facial expression and sending it to the government. I’ve worked at places that had telemetry and it’s more along the granularity of “how many people clicked the secondary button on the third tab?” This is a far cry from “spying on users”.

        • embedding-shape 19 hours ago

          > Sure, you can spend the weeks to months of expensive and time consuming work it takes to get a fuzzy, half accurate and biased picture of what your users workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision.

          Yes, admittedly, the first time you do these things, they're hard and you have lots to learn. But as you do this more often, build up a knowledge base and learn about your users, you'll gain knowledge and experience you can reuse, and it'll no longer take you weeks or months of investigations to answer "Where should this button go?", you'll base it on what you already know.

          • acedTrex 18 hours ago

            So people don't want to spend the time doing that; or, as is more often the case in corporate settings, the general turnover of the team is high enough that no one is around long enough to build that deep foundational product knowledge. And to be frank, most people do not care enough.

            This is why telemetry happens: it's faster, easier and more resilient to organizational turmoil.

            • embedding-shape 18 hours ago

              > This is why telemetry happens, its faster, easier and more resilient to organizational turmoil.

              I don't disagree with that, I was mainly talking about trying to deliver an experience that makes sense, is intuitive and as helpful and useful as possible, even in exchange for it taking longer time.

              Of course this isn't applicable in every case, sometimes you need different tradeoffs, that's OK too. But that some favor quality over shorter implementation time shouldn't drive people crazy, it's just making different tradeoffs.

              • acedTrex 18 hours ago

                > even in exchange for it taking longer time.

                I think in terms of corporate teams this is the issue a lot of the time: people just are not on the team long enough to build that knowledge. Between the constant reorgs, and these days layoffs and other churn, no one puts in the years required to gain the implicit knowledge. So orgs reach for the "tenure-independent" knowledge base.

          • hombre_fatal 18 hours ago

            Asking users isn't a substitute for usage data.

            Usage data is the ground truth.

            Soliciting user feedback is invasive, and it's only possible for some questions.

            The HN response to this is "too bad" but it's a thought-terminating response.

            • AlotOfReading 18 hours ago

              It goes the other way as well. Usage data isn't equivalent to asking users either. A solid percentage of bad decisions in tech can be traced to someone, somewhere forgetting that distinction and trusting usage data that says it's okay to remove <very important feature> because it's infrequently used.

              • junon 17 hours ago

                This. If I'm forced to use a feature I hate because it's the only way to do something, the "ground truth" reflects that I like that feature. It doesn't tell the whole story.

                • groby_b 16 hours ago

                  Most metrics teams are reasonably competent and are aware of that, excepting "growth hackers".

                  I haven't been in a single metrics discussion where we didn't talk about what we're actually measuring, if it reflects what we want to measure, and how to counterbalance metrics sufficiently so we don't build yet another growthhacking disaster.

                  Doesn't mean that metrics are perfect - they are in fact aggravatingly imprecise - but the ground truth is usually somewhat better than "you clicked it, musta liked it!"

                  • dpark 11 hours ago

                    Eh, there are a lot of cases where teams A/B test their way into a product that sucks.

                  • chillfox 8 hours ago

                    And yet, the observable evidence of changes in software that collects metrics directly contradicts this.

              • hombre_fatal 16 hours ago

                Yeah, it's not a good discussion without concrete examples.

                One: Building a good UX involves guesswork and experiments. You don't know what will be best for most users until you try something. You will often be wrong, and you rarely find the global maximum on the first try.

                This applies to major features but also the most trivial UI details like whether users understand that this label can be clicked or that this button exists.

                Two: Like all software, you're in a constant battle to avoid encumbering the system with things you don't actually need, like leaving around UI components that people don't use. Yet you don't want to become so terse with the UI that people find it confusing.

                Three: I ran a popular cryptocurrency-related service where people constantly complained about there being no 2FA. I built it and polished a UX flow to both hint at the feature and make it easy to set up. A few months later I saw that only a few people enabled it.

                Was it broken? No. It just turns out that people didn't really want to use 2FA.

                The point being that you can be super wrong about usage patterns even after talking to users.

                Finally: It's easy to think about companies we don't like and telemetry that's too snitchy. I don't want Microslop phoning home each app I open.

                But if we only focus on the worst cases, we miss out on the more reasonable cases where thoughtful developers collect minimal data in an earnest effort to make the UX better for everyone.

                • ragall 15 hours ago

                  > You don't know what will be best for most users until you try something.

                  That's because you don't understand your users. If you did, you wouldn't need to spy on them.

                  > you rarely find the global maximum on the first try

                  One never finds the "global maximum" with telemetry, at best a local sort-of maximum. To find what's best, you need understanding, which you never get from telemetry. Telemetry tells you what was done, not why or what was in the people's mind when it was done.

            • embedding-shape 17 hours ago

              > Usage data is the ground truth.

              For what, precisely? As far as I know, you can use it to know "how much is X used" but not more than that, and it's not a "ground truth" for anything besides that.

            • Brian_K_White 15 hours ago

              The ground truth that I never click on Stargate on Netflix is completely at odds with the actual truth that I love Stargate and want more of it and things like it.

              What the ground truth usage data is completely ignorant of is that Netflix's copy is a crappy blurry transfer, and so I got dvds instead.

              • dpark 11 hours ago

                Telemetry doesn’t tell the “why”. You never clicking on Stargate in Netflix is apparently true, so the telemetry isn’t wrong. It just doesn’t answer why.

                • Brian_K_White 6 hours ago

                  Le duh. The whole point is that the perfectly true data is misleading and uninformative, and the "ground truth usage data" argument has a plot hole.

                  It's not that it has no value at all, it's just that it's stupid to know one thing (how to collect usage data) and think that is all you need to know and that that obviates all other sources of understanding.

                  If you wish to collect money from human customers, you have to be some minimum level of human yourself. Talking to your customers is not some icky hardship to be avoided and replaced with a nice bash script.

              • selcuka 8 hours ago

                Sure, but Netflix is not interested in whether you love Stargate or not. Telemetry says that you never click it, so it's ok to remove it from their catalogue (which is correct).

                Now, they could've done a better job by increasing the quality, but that's a further (and costly) optimisation.

                • Brian_K_White 6 hours ago

                  It's not correct. I paid someone else for dvds. A little more of that and I may consciously question why I pay both netflix and ebay.

                • mort96 2 hours ago

                  Netflix should be very interested in a fact like "Netflix has show X, but Netflix subscribers who love show X choose to watch it someplace else due to issues with Netflix".

            • codedokode 14 hours ago

              Then pay for the data if you need it so bad.

            • yjftsjthsd-h 10 hours ago

              > Asking users isn't a substitute for usage data.

              Sure.

              > Usage data is the ground truth.

              Absolutely not. That's how you get "we buried this feature and nobody used it, so clearly nobody wants it".

          • johnfn 16 hours ago

            You seem to be interpreting my position as saying that one should only use telemetry to make decisions. Of course, no one reasonable would hold that position! What I’m saying is that only relying on user interviews without supplementing them with analytics would be knowingly introducing a blind spot into how you understand user behavior.

            • embedding-shape 16 hours ago

              Yes, probably because someone else said "If you dont have analytics you are flying blind" which I initially replied to, then when you replied to my reply, I took that as agreeing with parent, which isn't necessarily true.

              > What I’m saying is that only relying on user interviews without supplementing them

              I also took your "spend the weeks to months of expensive and time consuming work [...] Or you can look at the analytics" as a "either this or that proposition", where if we're making that choice, I'd go with qualitative data rather than quantitative, regardless of time taken. But probably it comes down to what tradeoffs we're willing to accept.

              • johnfn 12 hours ago

                Maybe it just comes down to how you interpret "flying blind", because I do tend to agree with that statement. Telemetry is one half of the puzzle, user interviews are the other. Without either I would argue you are flying blind; I think you agree here though.

                • zerkten 6 hours ago

                  In enterprise, you have little chance of getting the real story from end users in many cases. IT will also tell you that things are used one way, only for analytics to tell you it's the opposite. If you spend some of your UX research budget to deep dive on the area you can then finally get to the bottom of it.

                  I think the root of the complaints here is prioritization. The things they care about are prioritized. Qualitative feedback is likely already telling PMs that something is wrong and really should be fixed, but other feedback has more data supporting it.

        • ambicapter 18 hours ago

          > with perfect precision.

          Precision isn't accuracy and all that.

        • sdevonoes 18 hours ago

          Telemetry is the obvious step before surveillance. Not the telemetry you implement in your own small business, but at the scale of Microsoft, Apple, Meta… yeah

        • Lammy 16 hours ago

          > and sending it to the government

          It literally is. The network itself is always listening: https://en.wikipedia.org/wiki/Room_641A

          The mere act of making a network connection leaks my physical location, the time I'm using my computer, and the fact that I use a particular piece of software. Enough telemetry endpoints together create a fingerprint unique to me, because it is very unlikely that any other person at the same physical location uses the exact same set of software that I do, almost all of which wants to phone home all the goddamn time. It's the metadata that's important here, so payload contents (including encryption) don't even matter.

        • graphememes 15 hours ago

          You're never going to win this argument. Most of the people who post here have never actually shipped a product themselves and only work on isolated features, while others handle/manage all of this for them, so they have no real understanding of what it takes to do it.

          The other crowd that pretends otherwise are larping, or only have some generic open source project that only a handful of people use, or they only update it every 6 years.

          • embedding-shape 15 hours ago

            > You're never going to win this argument

            Probably because there is no "truth" here, only subjective opinion, there is no "winning", only "learning" and "sharing".

            I could ramble the same about how "people relying on data never shipped an enjoyable thing that people ended up loving, and only care about shipping as fast as possible" and yadda yadda, or I can actually make my points for why I believe what I believe. I do know what I prefer to read, so that's what I try to contribute back.

          • codedokode 14 hours ago

            You could hire people to be testers and pay them for the analytics; I think they would even allow you to record the screen if you paid well enough. The problem is that you do not want to pay or get consent: you want to grab the data for free, without permission, and without people realizing what you do. And that kind of people deserve much worse treatment than they get today.

          • matheusmoreira 14 hours ago

            Nobody actually cares "what it takes to do it", that's not our problem. You're not entitled to knowing even a single bit of information about us without our consent. Try innovating a way to do it without spying on people.

        • 6r17 15 hours ago

          "You’d think telemetry is screen recording your every move" - that's literally what tracing and telemetry is about.

          "Sure, you can spend the weeks to months of expensive and time consuming work it takes to get a fuzzy, half accurate and biased picture of what your users workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision." -> your analytics will never show what you didn't measure - it will only show what you already worked on - at best, it's some kind of validator mechanism - not a driver for feature exploration.

          This kind of monitoring needs to go through documented data exposure, and it's a sufficient argument for a company to stop using GitHub immediately if they take security seriously.

          But I'd add that if you take security seriously you are not on GitHub anyway.

          • johnfn 10 hours ago

            No, telemetry is not "literally" about screen recording. Telemetry is metrics. That is why they invented a new word for it rather than calling it "screen recording".

        • xigoi 14 hours ago

          Many products would be much better if they listened to what people are saying on public forums instead of using telemetry. For example, Google Maps has a longstanding bug where it auto-translates all reviews even if they are in a language you speak. If Google cared about user feedback, they could’ve easily fixed it, but no amount of telemetry will tell them this.

          • kelvinjps10 13 hours ago

            I hate this feature. Google knows the languages I speak because I added them in my account, even with all the tracking they obviously know, but they keep messing it up in all their products, Google Search, YouTube (they add machine audio translations to videos and translate the thumbnails).

            • Avamander 11 hours ago

              They even do it on Google Play. No, I don't want to buy books in a language I can't read; suggest ones I can. It's been like that for a decade now, I think. I guess it doesn't make them lose a noticeable amount of money.

          • zerkten 6 hours ago

            The reality is that most product leaders only care about the feedback that has visible consequences. If users aren't performing some action like quitting the app that shows in the telemetry, then they aren't going to pay attention.

            They'd probably call the issue you see a "craft" issue. Some PM is likely raising it. What happens is that leaders in big companies want perspectives based on data. You can go in with issues like yours but if you don't have clear data that shows significant numbers of users leaving, or users piling in, then you might as well not show up. People care about craft primarily will really struggle in these large organizations. That's not a good thing but how it is.

            In large organizations, you'll see a lot of A/B testing or experimentation. Some of the worst decisions from a craft perspective are ones where they only look for "did this cause some kind of negative impact on numbers?" situation. If your feature is neutral (on abandons, uninstalls, or whatever negative outcome), then it can get shipped which overrides any qualitative question around "should we ship this in this state?". Doesn't matter too much according to these folks because it's not making things worse (in terms of numbers.)

            There is probably more to explore in modern "product management" that's at the root of many of these problems. HN tends to focus on engineering, but within large companies there is now a bifurcation and the development of a field that forgets lots of PM was already invented.

            • mort96 2 hours ago

              These kinds of issues cause a negative feeling towards the product in the user. They keep using the product even after having seen a badly auto translated review from a language they speak or all these other things, but they now have a little bit more resentment towards the product. It makes them a bit more likely, over time, to switch to a competitor. Maybe they vent to a friend a month later and the friend suggests giving Apple Maps a try.

              How do the metrics you speak of capture these subtle, delayed effects?

          • Ferret7446 2 hours ago

            What you're saying is exactly wrong. What people say on public forums is a very biased sample, the proverbial vocal minority.

        • codedokode 14 hours ago

          Why do you need to collect a hardware fingerprint, IMEI, phone number, geolocation, list of nearby wifi access points, list of installed applications, selfie and passport photo when you can simply count how many times a server route was called?

          • mynameisvlad 14 hours ago

            That's a slippery slope and we both know it. Telemetry does not automatically include those things.

            • Barbing 13 hours ago

              Indeed it's not fair in the context of the discussion, so I wonder if it was meant as a statement on the ills of telemetry as a whole.

          • johnfn 13 hours ago

            My comment explicitly uses "how many people clicked the secondary button on the third tab" as an example, not any of that nonsense -- you are not responding in good faith.

        • matheusmoreira 14 hours ago

          > From this thread you’d think telemetry is screen recording your every move

          > it’s more along the granularity of “how many people clicked the secondary button on the third tab?”

          You don't see the contradiction here?

        • paulddraper 12 hours ago

          > Sometimes HN drives me crazy.

          You can tell the difference between those who build businesses and those who simply use them.

        • johannes1234321 12 hours ago

          There are two aspects of that:

          1) Metrics lead to wrong conclusions. There is software with extremely rarely used features; I need them only once or twice a year, but the ability is why I use the software to begin with. If metrics get too much attention, such things are removed as being unimportant ...

          2) A lot of the tracking happening is way too intrusive and opaque. There are valid use cases; however, some large corporations especially have in the past had cases where they collected way too much, including private information, without really giving information about it. That overshadows the good cases.

        • atoav 12 hours ago

          Yes, but the answer to "how many people clicked that button" is irrelevant if what it really describes is the outside world rather than your software. This is like concluding something is wrong with umbrellas because none of the users in the desert opened them.

          If the questions you have can be answered by simple telemetry, you are likely asking the wrong questions. E.g. a confused user will click all the buttons, while one that efficiently uses your software to solve a very specific problem may only ever press the same ones.

          The actually interesting questions are all about how your software empowers users to deal with the things they have to deal with. Ideally with as few buttons as possible. And if once a year they need that other button, it will be there.

          It is very easy to draw the wrong conclusions from telemetry.

        • thwarted 8 hours ago

          > Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision.

          Analytics do not tell you everything you need to know immediately. The analytics may say that no one is using a given feature, but they don't necessarily tell you why. Maybe they don't use it because they're not aware of it, marketing is presenting it wrong, or sales isn't selling against it. Maybe they've tried to use it and it doesn't work for them and they never tried it again. Maybe the call to action to bring them to it doesn't work or directs them wrong. Maybe it gets used by 1% of the users who happen to be power users. You might look at that 1% and conclude that it's not getting enough use to warrant supporting it or keeping it around.

        • skywhopper 6 hours ago

          The problem is that, without the context of actually talking to and observing users in the real world, software teams have repeatedly misinterpreted telemetry. Even the description of how they use telemetry to decide which features need investment or improvement shows this. In the face of huge data with no context, they make bad assumptions rather than talking to actual users. Over and over again.

      • Arch485 19 hours ago

        Exactly - purely "data driven" decisions are how we end up with ads really close to (or overlapping with) some button you want to press, because the data says that increases click-through rates! But it's actually a user-hostile feature that everyone hates.

        • PhoenixFlame101 18 hours ago

          But collecting data and looking for insights doesn't mean you mechanically optimize features, especially user-hostile ones? This is just as, if not more, likely to happen when basing your decisions on what people say they want over what they actually do.

          • defmacr0 17 hours ago

            If we were perfectly rational, then yeah, more data should never lead to worse decisions. However, it's easy to fall into the trap where being data-driven makes you only work on those things that you know how to measure.

        • mgfist 17 hours ago

          The reason that feature gets implemented is not because the devs think users will like it ... they know users don't want it, but it drives revenue and pays salaries.

      • staticassertion 16 hours ago

        It's sort of hilarious to compare "talking to people" with analytics. I'm not defending Github here, but you can't possibly think that "talking to 1M customers" is viable.

        • almostjazz 16 hours ago

          You could survey a representative sample

          • staticassertion 16 hours ago

            Not really. (a) People hate responding to surveys and hate emails; you're more likely to lose users than to get data. (b) There's no way you're surveying people in a way that gets you information like "time spent on a page" or "time between commits" or whatever.

            This is just nonsense tbh. Surveys and customer outreach solve completely different problems from analytics.

            • almostjazz 14 hours ago

              I agree you can't practically get the same information as you could with telemetry.

              Survey data is still real data that can be used for "analytics".

              Some people also hate telemetry. It feels invasive. I have a guess about what direction the percentage of consumers who hate telemetry is moving toward.

            • codedokode 14 hours ago

              You can hire people to test your product and provide analytics, rather than trying to siphon the data for free.

              • staticassertion 14 hours ago

                I'm not taking a side on whether a product should add telemetry. I'm rejecting the absurd notion that these suggestions are at all giving the same information.

        • xigoi 14 hours ago

          That’s what user forums are for.

          • staticassertion 14 hours ago

            You can set up a user forum if you'd like. If you think it will get you the same information that analytics will, you're obviously wrong.

            • xigoi 14 hours ago

              Kagi has a user forum (as well as listening to comments on other sites like Hacker News) and does not (at least supposedly) collect telemetry. They seem to be doing fine when it comes to feedback.

      • dpark 11 hours ago

        > Don't take what people say at face value, but think about it together with your knowledge and experience

        While you’re comparing different information sources, you might even want to consider telemetry, too.

      • SchemaLoad 10 hours ago

        What people say, and what people do are different things. Especially when the people who agree to talk to you aren't representative of the whole user base.

    • ubercore 19 hours ago

      It makes me think: what `gh` features don't generate some activity in the GitHub API that could just as easily guide feature development without adding extra telemetry?

      • larusso 18 hours ago

        Yeah. Unless they plan to move more local git operations in the tool and blur the line between git and gh.

    • nkrisc 19 hours ago

      > The difference between what people tell you when asked directly and how they actually use your software is actually shocking.

      And the difference between what they do and what they want is equally shocking. If what they want isn’t in your app, they can’t do it and it won’t show up in your data.

      Quantitative data doesn’t tell you what your users want or care about. It tells you only what they are doing. You can get similar data without spying on your users.

      I don’t necessarily think all data gathering is equivalent to spying, but if it’s not entirely opt-in, I think it is effectively spying no matter what you’re collecting, varying only along a dimension of invasiveness.

      • DrScientist 19 hours ago

        > If what they want isn’t in your app, they can’t do it and it won’t show up in your data.

        Excellent point.

        > but if it’s not entirely opt-in, I think it is effectively spying no matter what you’re collecting, varying only along a dimension of invasiveness.

        Every web page visit is logged on the http server, and that's been the default since the mid 1990's. Is that spying?

        • nkrisc 18 hours ago

          In principle, yes, I believe it is a form of spying. Not particularly invasive nor harmful, but spying nonetheless.

          Logging every page visited is not a technical requirement of serving the requested resource.
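
          To make that point concrete: Python's stdlib http.server writes an access-log line for every request by default, and a minimal sketch (the class name here is hypothetical) shows the same resources being served with that logging switched off.

```python
# Sketch: serving HTTP resources without per-request logging.
# log_message is the stdlib hook that http.server calls to write an
# access-log line for each request; overriding it with a no-op changes
# nothing about how the resource itself is served.
from http.server import HTTPServer, SimpleHTTPRequestHandler

class QuietHandler(SimpleHTTPRequestHandler):
    def log_message(self, format, *args):
        pass  # drop the access-log line; the response is unaffected

# Usage (not run here):
# HTTPServer(("", 8000), QuietHandler).serve_forever()
```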

          • vlovich123 18 hours ago

            > Logging every page visited is not a technical requirement of serving the requested resource.

            How will you know which page is having problems being served or is having performance problems?

            • nkrisc 18 hours ago

              You won’t, but that’s not what was asked.

              Logging the requested resource is not a technical requirement of serving that resource.

              • vlovich123 6 hours ago

                Depends how you define “technical requirement” but I’d say 404 for example is an indication of a failure to serve a given resource. If you don’t have logging you won’t know unless someone complains which means you’ll only catch the most visible issues. Same goes for performance - everywhere I’ve ever worked serving a resource was tightly coupled to “how fast can the user retrieve that resource”.

          • DrScientist 10 minutes ago

            But it's just tracking something the server was asked to do - I'd say it's legitimate logging.

            If you buy something at the supermarket, the supermarket keeps a record of the transaction - it's part of the process.

            However if you try to link that to entities and build a pattern of behaviour across multiple websites, then I think you stray into spying.

            Also if the tin of beans I bought at the supermarket records audio at home and uploads to the cloud - that's spying.

    • kodablah 19 hours ago

      > If you dont have analytics you are flying blind

      More like flying based on your knowledge as a pilot and not by the whims of your passengers.

      For many CLIs and developer tooling, principled decisions need to reign. Accepting the unquantifiability of usage in a principled product is often difficult for those that are not the target demographic, but for developer tools specifically (be they programming languages, CLIs, APIs, SDKs, etc), cohesion and common sense are usually enough. It also seems real hard for product teams to accept the value of the status quo with these existing, heavily used tools.

      • mckn1ght 16 hours ago

        Actually it's more like flying in the clouds with no instruments which can lead to spatial disorientation when you exit the cloud cover and realize you're nosediving towards the earth. https://en.wikipedia.org/wiki/Spatial_disorientation

        Flying based on the whims of your passengers would be user testing/interviewing, which is a complementary, and IMO necessary, strategy alongside analytics.

    • jubilanti 18 hours ago

      Wow, it really is sad how literally unthinkable it is to you and so much of the industry that you could actually talk to your users and customers like human beings instead of just data points.

      And you know what happens when you reach out to talk to your customers like human beings instead of spying on them like animals? They like you more and they raise issues that your telemetry would never even think to measure.

      It's called user research and client relationship management.

      • vlovich123 18 hours ago

        I think you’re overlooking that they were talking about stated and revealed preferences, a well known economic challenge where what people say is important to them and what shows up in the data is a gap. Of course you talk to users and do relationship management. That doesn’t negate the need to understand revealed preferences.

        In the OSS world this is not a huge deal. You get some community that’s underserved by the product (ie software package) and they fork, modify, or build something else. If it turned out to be valuable, then you get the old solution complemented or replaced. In the business world this is an existential threat to the business - you want to make sure your users aren’t better served by a competitor who’s focusing on your blindspot.

      • skeuomorphism 18 hours ago

        Marketing came to the conclusion that people don't know what they actually want. They decided to lump in engineers and programmers as well, since they started abusing their goodwill.

      • kalleboo 18 hours ago

        Apple and Microsoft reached their peak usability when they employed teams of people to literally sit and watch what users did in real life (and listen to them narrating what they want to do), take notes, and ask followup questions.

        Everything went to crap in the metric-based era that followed.

      • alexchantavy 18 hours ago

        The problem they're trying to solve is to find out which functions of their software are most useful for people, what to invest in, and to make decisions on product direction.

        Yes, vendors can, do, and should talk to users, but then a lot of users don't like receiving cold messages from vendors (and some users go so far as to say that cold messages should _never_ be sent).

        So, the alternative is to collect some soft telemetry to get usage metrics. As long as a company is upfront about it and provides an opt-out mechanism, I don't see a problem with it. Software projects (and the businesses around them) die if they don't make the right decisions.

        As an open source author and maintainer, I very rarely hear from my users unless I put in the legwork to reach out to them so I completely identify with this.

        • pc86 18 hours ago

          If you have an existing financial relationship with someone it is by definition not a "cold message." People who think they should never, ever be contacted by a company they are paying to use a service of are in the extreme minority. That's "cabin in the woods with no electricity" territory.

      • Sytten 18 hours ago

        You are inferring your own perception based on my comment; no need to be an asshole here. Like I said elsewhere, we do both and they serve different purposes. We also make it very clear and easy to disable in the onboarding. I hope you try to build a business sometime and open up your perspective that maybe, just maybe, you don't have all the answers.

        • TimorousBestie 16 hours ago

          > We also make it very clear and easy to disable in the onboarding.

          Yeah, sure. How long is that policy gonna last? How does a user even know that that checkbox does anything?

          Once you’ve decided to break a social contract it’s not like you can slap a bandaid on it and it’s all okay now.

          > I hope you try to build a business sometime and open up your perspective that maybe just maybe you don't have all the answers.

          People were building successful businesses long before the Internet.

        • keybored 14 hours ago

          > You are inferring your own perception based on my comment, no need to be an asshole here.

          People in this case are likely extrapolating based on how user data is harvested in the industry at large. So there is bound to be (very likely) some characterization that is unfair to you.

          Given modern data aggregation, really data vacuuming, and that software is opaque, it can be really hard to trust anyone with any aggregation of data. They say that they pseudonymize properly. The proof? Trust them bro. Then read yet another news article about how some data aggregation was either sloppily leaked or just a front for selling data.

          A natural response to opaque practices by people you don’t trust is a hardline no.

        • mcmcmc 12 hours ago

          You stated that you are blind without analytics, which heavily implies other forms of user research are useless and don’t provide meaningful signal. I don’t think an assumption that you’re not using other methods is that outrageous.

      • chao- 18 hours ago

        Customer interviews are an indispensable, high-value activity for all businesses. They are a permanent, ongoing capability that the organization must have. A conversation will surface things that analytics will not catch. People will describe their experiences in a qualitative manner that can inspire product improvements that analytics never will.

        However, the plural of "anecdote" is not "data". People are unreliable narrators, and you can only ask them so many questions in a limited time amid their busy lives. Also, there are trends which appear sooner in automated analytics by days, weeks, or even months than they would appear in data gathered by the most ambitious interview schedule.

        There is a third, middle-ground option as well: surveys. They don't require as much time commitment from the user or the company as a sit-down interview. A larger number of people are willing to engage with them than are willing to schedule a call.

        In my experience, all three are indispensable tools.

      • 7bit 16 hours ago

        Get off your high horse.

        Talking to users when you have hundreds of customers does no more than give you an idea of what those specific people need. If you have hundreds of users or more, then data is the only thing that reliably tells you these things.

    • pc86 18 hours ago

      You can "optimize a successful user journey" by making the software easy to use, making it load so fast people are surprised by it, and talking to your customers. Telemetry doesn't help you do any of that, but it does help you squeeze more money out of them, or find out where you can pop an interstitial ad to goose your ad revenue, and what features you can move up a tier level to increase revenue without providing any additional value.

    • a012 18 hours ago

      You have all info you need on server side, I don’t believe that you’re totally blind without client tracking

    • ryandrake 18 hours ago

      This got me thinking: Are there prominent examples of open source projects that 1. collect telemetry, 2. without a way to opt-out (or obfuscating / making it difficult to opt-out)? This practice seems to be specific to corporate software development.

      Why is it that startups and commercial software developers seem to be the only ones obsessed with telemetry? Why do they need it to "optimize user journeys" but open source projects do just fine while flying blind?

      • theplatman 17 hours ago

        Open source projects are usually creating something for themselves, so it's much easier to know what to build when you are the user.

        Commercial software, by contrast, has a disconnect between who the users are and who the developers are.

    • e12e 18 hours ago

      I think there's room for a distinction between "not using metrics" and "not using data".

      Unthinkingly leaning on metrics is likely to help you build a faster, stronger horse, while at the same time avoiding building a car, a bus or a tractor.

    • yoyohello13 18 hours ago

      The totality of Microsoft's products is proof that this is false. If telemetry and analytics actually mattered for usability, every product Microsoft puts out would be good instead of garbage.

      • SchemaLoad 10 hours ago

        There are far too many factors at play to attribute the quality of Microsoft's products to telemetry.

        Having the data doesn't mean you will act on it. And it doesn't mean Microsoft's interests are aligned with the users'.

    • ctoth 18 hours ago

      > If you dont have analytics you are flying blind.

      We... we are talking about a CLI tool. A CLI tool that directly uses the API. A tool which already identifies itself with a User-Agent[0].

      A tool which obviously knows who is using it. What information are you gathering by running telemetry on my machine that couldn't.. just. be. a. database. query?

      Reading the justification the main thing they seem to want to know is if gh is being driven by a human or an agent... Which, F off with your creepy nonsense.

      Please don't just use generic "but ma analytics!" when this obviously doesn't apply here?

      [0]: https://github.com/cli/cli/blob/3ad29588b8bf9f2390be652f46ee...

    • renegade-otter 18 hours ago

      I agree with you in that regard. That said, knowing that this is Microsoft, the data will be used to extract value from the customers, not provide them with one.

    • tomrod 17 hours ago

      Teams that do this need to just dogfood internally. Once you start collecting telemetry on external users defaulted to opt-in you're not a good faith actor in the ecosystem.

    • goosejuice 16 hours ago

      You could, I don't know, do user interviews with the various customer segments that use your product.

    • Apylon777 16 hours ago

      How did GitHub ever survive without this telemetry? Was it a web application buried in obscurity?

    • lynndotpy 15 hours ago

      Game developers benefit tremendously from streams where they get to see people's webcams _and_ screens as they use their software.

      This would be _absolutely insane_ telemetry to request from a user for any other piece of software, but it would be fantastically useful in identifying where people get frustrated and why.

      That said, I do not trust Microsoft with any telemetry, I am not invested in helping them improve their product, and I am happy not to rely on the GitHub CLI.

    • codedokode 14 hours ago

      Analytics is wrong. I never click any ads, but they keep showing them. I avoid registering or enter fake emails, but they keep showing full-screen popups asking for email. I always reject cookies, but they still ask me to accept them. And YouTube keeps pushing those vertical videos for alternately gifted kids despite me never watching them. What's the point of this garbage analytics? It seems that their only goal is to annoy people.

      • sagarm 7 hours ago

        All of those are affected by analytics.

        Ad slots will be filled whether or not you click. If you never click, you'll tend to match with either very low quality ads or ads that pay per impression (display ads).

        Email registration is highly valuable for a business, so analytics won't be used to decide whether to show the modal but rather test different versions of it.

        Cookies are too valuable to not push on users, because without them only the previously mentioned low quality ads can be shown. High quality and display ads match on interest or demographic labels.

        The business decision to keep vertical videos is highly likely to be affected by analytics, and of course the choice of which videos to show is based on recommendation models trained on interaction logs.

        The priority isn't making your experience better, though that is often an incidental result -- it's driving the business.

    • attentive 13 hours ago

      It's not like they don't own API's that those cli's are hitting. They have all the stats they need.

    • sidkshatriya 13 hours ago

      > If you dont have analytics you are flying blind

      If you have too much emphasis on (invasive) analytics you might end up flying empty i.e. without customers.

    • chillfox 8 hours ago

      Be very careful with that.

      Analytics-driven development easily leads to bad outcomes:

      1. An important but less frequently used feature gets moved to a hidden spot, leading to even less usage and eventual removal.

      2. A poorly functioning feature doesn't get the improvement it needs, because few people use it due to how poorly it functions.

      I have seen these patterns a lot in software where decisions are based on analytics, and I usually stop using that software when I find a replacement.

    • qwertox 2 hours ago

      They could well use the data from their own developers, couldn't that be enough?

  • ForHackernews 19 hours ago

    Arguably yes. git has a terrible developer experience and we've only gotten to this point where everyone embraces it through Stockholm syndrome. If someone had been looking at analytics from git, they'd have seen millions of confused people trying to find the right incantation in a forest of confusing poorly named flags.

    Sincerely, a Mercurial user from way back.

  • dualvariable 19 hours ago

    > Is it not sufficient to employ good engineering and design practices? Git...

    Git has horrible design and ergonomics.

    It is an excellent example of engineers designing interfaces for engineers without a good feedback loop.

    Ironically, you just proved your point that engineers need to better understand how users are actually using their product, because their mental visualization of how their product gets used is usually poor.

    • consp 18 hours ago

      Apparently I use git wrong since I do not feel this design and ergonomics issue.

      • halapro 16 hours ago

        How many years of experience with git do you have? How much of git do you use? I bet you use 5 commands and 10 flags at most. Take a look at git's docs

    • skydhash 16 hours ago

      > Git has horrible design and ergonomics.

      People say this but never write about the supposed failure of design. Git has a very good conceptual model, and then provides operations (aptly named once you know the model) to manipulate it.

      Most people who complain about git only think of it as code storage (folder {v1,v2,...}) instead of version control.

      • halapro 16 hours ago

        > never has written about

        If you don't want to look at what people write you can't say that they haven't written about it.

        > the supposed failure of design

        I don’t think people complain about the internals of git itself as much as the complexity of all the operations.

        If you want to read about complaints, you really don't have to look further than the myriad of git GUIs, TUIs and otherwise alternative/simplified interfaces.

        • skydhash 13 hours ago

          > I don’t think people complain about the internals of git itself as much as the complexity of all the operations.

          The complexity is only there when you want to avoid learning what you’re doing. Just like find(1) is complex if you don’t know about the file system, or sed(1) is complex if you don’t know regex and line-based addressing of a text file.

          A lot of people who are using git don’t want to know what a commit is and their relation to branches. And then they are saying rebasing is too complex.

          > If you want to read about complaints, you really don't have to look further than the myriad of git GUIs, TUIs and otherwise alternative/simplified interfaces

          Git is a CLI. The goal is always for you to find your workflow and then create aliases for common operations. It does assume that you want complete control and to avoid magic (which is what jj does).

          Wanting magic is great (I use magit which makes git magical ;) ) but it’s like wanting to fly a plane without learning the instruments.

  • dietr1ch 18 hours ago

    It's not the devs themselves but the team/project/product management show that needs to pretend it's data driven, and then resorts to the silliest metrics that are easy to measure.

  • rafram 18 hours ago

    > Would Git have been significantly better if it had collected telemetry

    Yes, probably. Git is seriously hard to use beyond basic tasks. It has a byzantine array of commands, and the "porcelain" feels a lot closer to "plumbing" than it should. You and I are used to it, but that doesn't make it good.

    I mean, it took 14 years before it gained a `switch` command! `checkout` and `reset` can do like six different things depending on how your arguments resolve, from nondestructive to very, very destructive; safe(r) operations like --force-with-lease are made harder to find than their more dangerous counterparts; it's a mess.

    Analytics alone wouldn't solve the problem - you also need a team of developers who are willing to listen to their users, pore through usage data, and prioritize UX - but it would be something.

  • 1vuio0pswjnm7 18 hours ago

    Perhaps the more interesting question is why these companies feel the need to "explain" why they are collecting telemetry or "disclose" how the data is used

    The software user has no means to verify the explanation or disclosure is accurate or complete. Once the data is transferred to the company then the user has no control over where it goes, who sees it or how it is used

    When the company states "We use the data for X" it is not promising to use the data for X in the future, nor does it prevent the company, or one of its "business partners", from using the data additionally for something else besides X

    Why "explain" the reason for collecting telemetry

    Why "disclose" how the data is used

    What does this accomplish

  • lo1tuma 18 hours ago

    I’m curious as well. Github is one of the rare products out there that get actual valuable user feedback. So why not just ask the users for specific feedback instead of tracking all of them.

  • mbreese 18 hours ago

    > I'm curious why corporate development teams always feel the need to spy on their users?

    This isn’t that surprising to me. Having usage data is important for many purposes. Even Debian has an opt-in usage tracker (popcon) to see what packages they should keep supporting.

    What I’m curious about is why this is included in the CLI. Why aren’t they measuring this at the API level where they wouldn’t need to disclose it to anyone? What is done locally with the GH CLI tool that doesn’t interact with the GitHub servers?

  • _heimdall 18 hours ago

    Anonymous telemetry isn't necessarily spying, though "pseudoanonymous" sounds about as well protected as distinguishing between free speech and "absolutism." Github also wouldn't be tracking git use here, but the `gh` CLI that you don't need to install.

    All that said, having been in plenty of corporate environments I would be surprised if the data is anonymized and wouldn't be surprised if the primary motivator boils down to something like internal OKRs and politics.
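
For what it's worth, "pseudonymous" in this context usually means something like a keyed hash over a stable user identifier. A minimal sketch of the idea (the salt and IDs here are invented for illustration; this is not GitHub's actual scheme):

```python
import hashlib

def pseudonymize(user_id: str, salt: bytes) -> str:
    # Keyed BLAKE2 hash: hard to reverse without the salt, but the token
    # is stable, so events from the same user remain linkable over time --
    # which is exactly why "pseudonymous" is weaker than "anonymous".
    return hashlib.blake2b(user_id.encode(), key=salt, digest_size=8).hexdigest()

a = pseudonymize("octocat", b"server-side-secret")
b = pseudonymize("octocat", b"server-side-secret")
print(a == b)  # same user, same token: behavior can be tied together
```

The linkability is the whole point for analytics, and also the whole privacy concern.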

  • chrishill89 17 hours ago

    Git relatively recently got an `--i-still-use-this` option for two deprecated commands that you have to pass if you want to use them. The error you get tells you about it and says you should "please email us here" if you really are unable to figure out an alternative.

    I guess that's the price of regular and non-invasive software.

  • naikrovek 17 hours ago

    I'm curious why people think this is in the same ballpark as that something like a private investigator can do. This isn't spying at all.

    "oh no, they're aware of someone at the computer 19416146-F56B-49E4-BF16-C0D8B337BF7F running `gh api` a lot! that's spying!"

  • poulpy123 17 hours ago

    The current AI boom is entirely based on data. The more data you have, the more you can train and the more money you make.

  • rienbdj 17 hours ago

    When allocating engineering spend you need to predict impact. If you know how the features of the GitHub CLI are used, you can do this more easily.

  • KronisLV 17 hours ago

    > I'm curious why corporate development teams always feel the need to spy on their users?

    Cause the alternative is viewing all of your app as one opaque blob - you don't know exactly how it's being used, which features actually need your attention, especially if you're spread thin. If you're in consulting or something like that and the clients haven't let you configure and/or access analytics (and the same goes for APM and log shipping), it's like flying blind. Couple that with vague bug reports instead of automated session recording and if you need to maintain that, you'll have gray hairs appearing by the age of 30.

    Take that disregard of measurement and spread it all across the development culture and you'll get errors in the logs that nobody is seeing and no insights into application performance - with the system working okay at a load X, but falling over at X+1 and you having to spend late evenings trying to refactor it, knowing that it needs to be shipped in less than a week because of client deadlines. Unless the data is something that's heavily regulated and more trouble than it's worth, more data will be better than less data, if you do something meaningful with it.

    > Would Git have been significantly better if it had collected telemetry, or would the data not have just been a distraction?

    Knowing the most common fuck ups and foot guns might inform better CLI design. Otherwise people saying that it's good have about as much right to do so as saying that it's bad (at least in regards to UX), without knowing the ground level truth about what 90% of the users experience.

    • skydhash 16 hours ago

      > you don't know exactly how it's being used, which features actually need your attention, especially if you're spread thin.

      Why not conduct a survey?

      > vague bug reports instead of automated session recording and if you need to maintain that, you'll have gray hairs appearing by the age of 30.

      If it's a customer, why not reach directly to him?

      > with the system working okay at a load X, but falling over at X+1 and you having to spend late evenings trying to refactor it,

      No one is talking about telemetry on your servers. We're talking about telemetry on client's computers.

      • KronisLV 12 hours ago

        > Why not conduct a survey?

        Large amounts of time before getting feedback, low percentage of people responding, not an accurate sample of all users (you will get signal from the loudest ones) and inaccurate data (biases in perception) instead of measurable reality. Not useless, but not a full replacement for telemetry.

        > If it's a customer, why not reach directly to him?

        Layers of indirection (and also the slowness of getting through them). You might not control the processes and approvals needed to do that in a non-startup environment. You will probably control enough of the app to add various technical solutions to aid you in collecting information.

        > No one is talking about telemetry on your servers.

        I am. Culture of not collecting client side data also often comes together with a culture of not collecting server side data properly either. Competent teams will evaluate both. My argument is that all of this data can meaningfully help in development and that other approaches don't replace it well enough.

        • skydhash 9 hours ago

          > Large amounts of time before getting feedback […]inaccurate data (biases in perception) instead of measurable reality.

          I think perception is more valuable because that is the only way of measuring frustration with the UX of the software. Something may be used a lot, but is painful to everyone.

          > You might not control the processes and approvals needed to do that in a non-startup environment

          In such environments, it’s often true that features are governed by third parties as well. You collect telemetry, but have no say in what users will get and experience.

          > Culture of not collecting client side data also often comes together with a culture of not collecting server side data properly either

          I strongly doubt that. We have way more tools for collecting data on servers than for doing telemetry on clients.

          ——

          As a user, I essentially want my software to stay the same as long as possible. If I really need a new feature or am struggling with a bug, that’s when I contact the developers. If the developers are dogfooding their own software, new features can be quite a delight, especially when their profile is similar to yours.

          But telemetry driven development is how you get features no one asked for.

          • KronisLV 1 hour ago

            Very reasonable arguments!

            > As a user, I essentially want my software to stay the same as long as possible. If I really need a new feature or am struggling with a bug, that’s when I contact the developers.

            I largely agree and think that if the world had more users like you, it would be a better place, with more stable software. Unfortunately, in any industry with competing solutions, your users probably won't go through all that trouble and will just pick another product, alongside companies having all sorts of incentives to ship stuff (sometimes the deadlines being made up) and therefore needing to make informed decisions about what to build and what to maintain ASAP. I think we'll get features nobody asked for, telemetry or no telemetry. As for everything else, it depends.

  • Lammy 16 hours ago

    The people who write any individual feature want to be able to prove usage in order to get good performance reviews and promotions. It's so awful that it's become normalized. Back in The Day we had the term “spyware” to refer to any piece of software that phoned home to report user behavior, but now that's just All Software.

  • rprend 16 hours ago

    Product work can be counterintuitive. An engineer / PM might think that a design or feature “makes sense”, but you don’t actually know that unless you measure usage.

  • bastardoperator 16 hours ago

    You have three features, A, B, and C. They are core features. Two of the features break. How do you prioritize which feature gets fixed first? With telemetry its obvious, without it, you're guessing.

    Also, the gh CLI is not about git; it's about the GitHub API. In theory the app has its own user agent, and of course their LB is tracking all HTTP requests, so it's never anonymous anyway.

  • stronglikedan 16 hours ago

    > always feel the need to spy on their users?

    If it's truly pseudoanonymous then it's hardly spying, just sayin'...

    Others have answered your actual question better than I could have.

  • teeray 15 hours ago

    Because dashboard need to show number go up

  • high_na_euv 14 hours ago

    git is terrible from a UX perspective

    >Would Git have been significantly better if it had collected telemetry, or would the data not have just been a distraction?

    Definitely

  • sidkshatriya 13 hours ago

    > ...our team needs visibility into how features are being used in practice. We use this data to prioritize our work and evaluate whether features are meeting real user needs.

    You should be able to see which features are being used by seeing which server endpoints are being hit and how often. You don't need intrusive telemetry. Yes, it's not perfect; many features could use the same endpoint. But you could totally anonymise this and still get a great understanding of which features users are using by looking at endpoint stats.

    Companies need to decide whether they want customer goodwill or very detailed insight into what their users are doing. By having invasive telemetry you may end up with fewer customers (people leaving for GitHub competitors). Is it worth it?
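
The endpoint-stats idea can be sketched with nothing more than server access logs. A rough illustration in Python (the log format and paths are made up for the example; real logs would be read from disk):

```python
import re
from collections import Counter

# Hypothetical access-log lines: quoted method + path, then status code.
log_lines = [
    '"GET /repos/octo/hello/pulls" 200',
    '"GET /repos/octo/hello/pulls" 200',
    '"POST /repos/octo/hello/issues" 201',
    '"GET /user" 200',
]

def endpoint(path: str) -> str:
    # Collapse owner/repo segments so counts are per-endpoint, not per-user.
    return re.sub(r"/repos/[^/]+/[^/]+", "/repos/{owner}/{repo}", path)

counts = Counter()
for line in log_lines:
    m = re.search(r'"(\w+) (\S+)"', line)
    if m:
        counts[f"{m.group(1)} {endpoint(m.group(2))}"] += 1

print(counts.most_common())
```

Nothing here needs a client-side agent, and the normalization step discards who made the request while keeping what was used.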

  • ravenstine 12 hours ago

    "Real" numbers make it easier for them to lie to leadership who then use or ignore those lies to justify decisions they were already going to make in spite of users.

  • paulddraper 12 hours ago

    To state the obvious, "good engineering/design practices" will not tell you what features are used or not.

    > Git has served us well for 20+ years

    Funny. I think that, but the usual HN narrative is that Git is UX hostile.

  • caymanjim 9 hours ago

    The language used in statements like this always annoys me. "We want visibility", ok fine. "We need"...the hell you don't.

  • crubier 4 hours ago

    Come on, do you actually think "corporate engineers" care about what you are doing individually? Do you think they look specifically for you and make fun of your individual usage patterns? Do you think it gives them interesting information about your private life?

    No one cares; there are millions of users, no one is going to look at your data, and even fewer would actually be able to know which person a given user is.

    We legit just want to know aggregated and objective information about how people use our products so we can make it better for you.

    "Why do all corporate try to spy on our usage patterns?" Because the ones who don't have a crap product and all died long ago.

  • raxxorraxor 4 hours ago

    Especially true because it has also been shown that these telemetry tools often paint a distorted picture of reality.

    My "telemetry" check is Windows. It does the same, forced it on users, and it certainly didn't get better from it. The correlation with the quality decline might be independent, but I would wager some shitty manager put up a shitty metric, everything got optimized toward that, and leadership lost the bigger picture, because these numbers cannot convey general understanding.

ryanshrott 18 hours ago

> you're going to have to opt out of a lot more than this one setting

The opt-out situation for gh CLI telemetry is actually trickier than it sounds. gh runs in CI/CD pipelines and server environments where you may not want any outbound connections to github.com at all, not because of privacy but because of networking constraints. In those environments, telemetry being on by default means your CI fails or your bastion host can't reach GitHub at all.

Compare this to git itself, which is entirely local until you explicitly push. The trust model is different: git will never phone home unless you configure it to. gh, being a wrapper around the GitHub API, has to make those calls to function - but that's separate from whether it should also be collecting and uploading your command patterns.

  • tensegrist 16 hours ago

    > In those environments, the telemetry being on by default means your CI fails or your Bastion host can't reach GitHub at all.

    i'd be surprised if the inability to submit telemetry is a hard error that crashes the program

    • rtpg 10 hours ago

      There are definitely slightly annoying variants of this of "ah the program does its job in 200ms but takes 5s to shutdown timing out trying to send telemetry data". Especially annoying on CLI programs.

      I have been unpleasantly surprised by several programs outright crashing when not being able to send telemetry data consistently. Though this has usually been when the connection is a bit odd and it is able to send through _some_ stuff but then crashes when it fails later.
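
A common guard against this failure mode is to run the telemetry send on a daemon thread and bound how long the process waits for it. A sketch of the pattern (the 5-second `slow_sink` stands in for a hung endpoint; this is not how any particular CLI actually implements it):

```python
import threading
import time

def slow_sink(payload):
    # Stand-in for a telemetry POST to a slow or unreachable endpoint.
    time.sleep(5)

def send_telemetry(payload, sink=slow_sink):
    # Daemon thread: if the send hangs, it cannot block interpreter exit.
    t = threading.Thread(target=sink, args=(payload,), daemon=True)
    t.start()
    return t

start = time.monotonic()
t = send_telemetry({"cmd": "pr list"})
t.join(timeout=0.1)  # wait briefly, then give up and let the process exit
elapsed = time.monotonic() - start
print(f"exited after {elapsed:.2f}s despite a 5s sink")
```

The bounded join is what keeps a 200ms command from turning into a 5s command; crashing outright on send failure, as described above, means the send path isn't isolated like this at all.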

      • kippinsula 5 hours ago

        ran into this flavor once with a different tool, not gh. our deploy job was consistently about 8s longer than it should've been, turned out a fire-and-forget telemetry POST wasn't actually fire-and-forget when the endpoint got slow. NO_PROXY plus blackholing the host fixed it, but probably the kind of thing you shouldn't have to find via flame graph.

  • hahn-kev 16 hours ago

    Isn't the gh CLI useless if it can't connect to GitHub.com? Or does it work with enterprise GitHub and that's the use case you're talking about.

    • pledg 13 hours ago

      It does work with enterprise instances

CMay 19 hours ago

If you have 3 of your developers spending 80% of their time in an area of the codebase that gets no usage and you don't see a path forward that realistically is likely to increase usage, it can be a better use of developer time to focus them elsewhere or even rethink the feature.

The problem I have with a lot of these analytics is that while there are harmless ways to use it, there is this understanding that they could be tying your unique identifier to behavioral patterns which could be used to reconstruct your identity with machine learning. It's even worse if they include timestamps.
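
To make that linkage risk concrete, here is a toy sketch (every value below is invented): once pseudonymous events carry timestamps, a join against any other timestamped log that names the user is enough to re-identify.

```shell
# Toy data: pseudonymous telemetry on the left, an identified CI log on
# the right, sharing nothing but timestamps. All values are invented.
cat > telemetry.txt <<'EOF'
2024-05-01T09:00:01 uuid-42 pr_create
2024-05-01T09:14:33 uuid-42 pr_merge
EOF
cat > ci_log.txt <<'EOF'
2024-05-01T09:00:01 alice push
2024-05-01T09:14:33 alice merge
EOF

# Join on the shared timestamp column: "anonymous" uuid-42 maps to alice.
join telemetry.txt ci_log.txt
```

Real linkage attacks are noisier than an exact equi-join, of course, but coarsening the timestamps only slows this down; it doesn't prevent it.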

Why not just expose exactly what telemetry is being sent when it's sent? Like add an option that makes telemetry verbose, but doesn't send it unless you enable it. That way you can evaluate it before you decide to turn it on. Whenever you do the Steam Hardware survey it'll show you what gets sent. This is the right way to do it.

embedding-shape 21 hours ago

Love it when a PR is brief: https://github.com/cli/cli/pull/13254

> Removes the env var that gates telemetry, so it will be on by default.

  • kevincox 20 hours ago

    Not only is it on by default, it also doesn't seem possible to disable. It's forced on (except for enterprise, it seems).

    • dfc 20 hours ago

      There is a "How to opt out" section in TFA.

bakies 20 hours ago

So happy I deployed gitea to my homelab last month. It's got an import feature from github, and it's honestly just faster with better uptime than github. Claude can use it just fine with the tea cli and git. It's pretty much a knockoff github, but I think it's better so far.

  • huijzer 20 hours ago

    I’m running Forgejo, which has the same core code, and yeah, it's amazing. Faster and better uptime indeed. It even works when my internet goes down, because it's on a Pi 4 here in the cabinet next to my desk. Backups are done with borg and syncthing to an offsite location. It takes a bit of work setting up, but after that maintenance time is near zero. I just manually SSH in once every two weeks to check SSD space and RAM usage, run apt update and upgrade, and do major version bumps.

ImJasonH 20 hours ago

Do people think that GitHub isn't already collecting and aggregating all the requests sent to their servers, which is after all the entire point of the gh CLI?

If you don't want your requests tracked, you're going to have to opt out of a lot more than this one setting.

  • pixel_popping 19 hours ago

    Data is on their server, so obviously they are already doing it; they just want to increase tracking, because additional client-side metrics also let them see what transits to GitLab, Codeberg and such.

    • ImJasonH 19 hours ago

      I did not get that impression from these docs or from a brief look through the gh CLI codebase. Can you point to evidence that makes you believe this is used to collect metrics about requests to other services?

raxxorraxor 3 hours ago

I would very much recommend Gitea here. It is just a better GitHub. It can be integrated into corpo networks just fine, and you can have a CI/deployment pipeline you completely control.

Sure, if you want your repos to be public, you need a host. But honestly they aren't too pricey for offering code, even if prices are currently very high.

grugdev42 20 hours ago

Remember that thing Microsoft does?

Embrace, extend, extinguish.

The first two have been done.

I give it five years before the GH CLI is the only way to interact with GitHub repos.

Then the third will also be done, and the cycle is complete.

  • square_usual 18 hours ago

    > I give it five years before the GH CLI is the only way to interact with GitHub repos.

    I'll take that bet. How much are you willing to put on it?

  • naikrovek 17 hours ago

    people that say things like this are exhausting. exhausting. You make it so very easy to classify you straight into the "looney" bin. People said that WSL was EEE for Linux. when that didn't happen, people said that WSL gaining GPU support was EEE. When that didn't happen, people said that WSLg was EEE for Linux. People said that Powershell was EEE for Windows.

    None of these happened. none of them even appear to have happened, and none of them appear to have even been planned. It's all a hallucination by people that talk like this. It's all imaginary. Show me any evidence of anything like this. ANY AT ALL. Not a hunch, not something that could be interpreted that way, show me the very clear and repeatable steps that Microsoft used in the 90s to EEE something in anything they're doing today.

    They're too busy copiloting everything and arguing with each other to do this. Show me Microsoft Git with extra features over the open source version. Show me Microsoft Linux with extra features over the open source version. Show me Microsoft ANYTHING with extra features over the open source version they copied, and show me the doors slowly closing behind me. You can't. Because it isn't happening.

    git repos can't be locked up in the way you're describing. github is a wrapper around git. it would take an enormous amount of work for microsoft to change this fundamental decision in the design of github. GitHub is a git server, over both HTTP and SSH. These are core decisions of the software that everything else sits on top of. If pulling git repos over HTTP or SSH ever stops being supported, so many things are going to stop being supported, that it just won't be useful at all after that point.

    the gh cli makes api calls, that's all. it just makes api calls easier. it exposes a lot of api calls as friendly commands. it's not something that most of github users are even aware of, much less use. gh is not going to lock someone into a github "ecosystem" A) because such a thing doesn't exist and B) again, most people don't use it.

    Microsoft is far more likely to kill GitHub because of people with MBAs (aka "morons") who somehow call the shots in business these days. They are not going to pilot it into the ground by EEE. They are going to pilot it into the ground because they don't know what they're doing, and they don't know what users want or what they like. That will be the fate of GitHub; incompetence will kill it, not cold, calculating, nefarious competence.

    • waisbrot 15 hours ago

      I think the down-votes on this comment are too bad. It's legitimately funny to write a multi-paragraph rant in high dudgeon calling other people "exhausting".

      • naikrovek 9 hours ago

        If reading exhausts you, don’t read it.

        The comment’s size is apparent before reading a single word, so you can avoid it if it is too large. “EEE” comments are short and exhausting and there is no warning visible.

  • jmclnx 16 hours ago

    >I give it five years before the GH CLI is the only way to interact with GitHub repos.

    I do not doubt this; it already seems to be a pain to deal with some repos on github without using gh. I do not know what gh buys you, but I have never used it, so I do not know if it is "better". To me the standard git commands are fine. But yes, I think the trend of forcing gh upon us is already underway.

    • zzo38computer 15 hours ago

      I do use a command-line program as the only way to interact with GitHub (using the GitHub API), but I do not use GH CLI; I have my own implementation (which is much smaller than theirs). (They can see that I use my own, because of the User-Agent header, and they can also see what APIs are accessed.) (Git can also be used, but only for the functions of Git rather than the functions of GitHub.)

ConceptJunkie 18 hours ago

Do they mean "pseudonymous" telemetry, meaning "non-identifying telemetry", or do they mean "pseudoanonymous" telemetry, meaning telemetry that is not really anonymous?

Those two words have almost exactly opposite meanings, and as stated, they are literally saying they are collecting identifiable data.

  • naikrovek 17 hours ago

    it means they can see all the telemetry from a single machine, but the identity of the machine is not tied to any human identity or github account. each machine appears to get its own UUID and that's how they "identify" machines.

  • layer8 15 hours ago

    The page only uses the term “pseudonymous”. “Pseudoanonymous” seems to be an invention of the HN submitter.

mrt181 3 hours ago

"We need actual usage data." Meanwhile, users run frontends on top of gh-cli so they don't have to deal with gh-cli itself.

goosejuice 16 hours ago

This should be opt-in. Force their employees to opt in if they want. That's plenty of data to make informed decisions.

ZetsuBouKyo 5 hours ago

One day, I hope to see servers walking around with cameras, recording customers' reactions to the food for the chefs' feedback.

mghackerlady 20 hours ago

can someone explain why github has a CLI? why wouldn't you just use git?

  • bakies 20 hours ago

    PRs, managing repos, and other things that aren't git features. You can use it to auth with GITHUB_TOKEN instead of ssh or http, which is how my agents get access. I've switched to gitea; it's got all the same features.

    • mghackerlady 18 hours ago

      ah, that's probably why I've never had any use for it. I don't really contribute to any large open source projects and prefer the sourcehut/lkml style of using git

  • chris_money202 20 hours ago

    gh is insanely powerful, especially if you let your coding agent use it. It’s one of my top tools. gh lets you use GitHub features such as issues, pull requests, reading CI pipelines, creating CI pipelines, etc. git is just for code version control.

  • Atreiden 20 hours ago

    Creating PRs, reading PRs, creating/reading Issues, triggering actions, to name a few

  • none2585 20 hours ago

    At my last job they used gh features heavily - pull requests, issues, and GHA most of all. So having the CLI made it possible to automate github-specific tasks (or have agents interact with them).

  • cerved 20 hours ago

    You use gh to interact with the forge, git to interact with the repo.

    For example

      gh pr checks --watch
    

    will run and poll the CI checks of a PR and exit 0 once they all pass

  • koito17 19 hours ago

    At my current job, I sometimes set up a Nix shell with the GitHub CLI, since that lets Claude Code associate a feature branch with a pull request. The LLM can then retrieve the PR description, workflow results, review comments, etc.

    Also, I believe GitHub Actions cache cannot be bulk deleted outside of the CLI. The first time I [hesitantly] used the gh CLI was to empty GitHub Actions cache. At the time it wasn't possible with the REST API or web interface.

brown9-2 9 hours ago

What’s confusing about this is that every gh command is just a wrapper around their API.

layer8 15 hours ago

*pseudonymous

The article doesn’t use the word “pseudoanonymous”, only “pseudonymous”.

tornikeo 19 hours ago

Good for GitHub. All companies need this. Some use it to improve products, some use it for less commendable goals. I know the HN crowd is allergic to telemetry, but if you've ever developed software as a service, telemetry is indispensable.

  • isoprophlex 18 hours ago

    God forbid you talk to your users

    • tornikeo 13 hours ago

      Talking is a must. But just like quantum particles, users talk one way and behave another. Just look at gamers - most of them say they _hate_ AI in games, yet they are actively behaving differently: buying games made with AI, using AI, etc.

      • user3939382 9 hours ago

        It’s okay if I spy on you without your consent, it’s for your own good. Or my own good. Something like that, is that your point? The ends justify the means? How about respect as a feature, that one you don’t need telemetry to determine.

  • Citizen_Lame 18 hours ago

    All AI bros are the same.

    P.S. You look like a villain from Temu.

  • xpe 18 hours ago

    Thinking out loud: what are the best practices for vetting a tool's telemetry details? The devil is in the details.

    A quick summary of my Claude-assisted research is at the Gist below. Top of mind is some kind of trusted intermediary service with a vested interest in striking a definable middle ground that is good enough for both sides (users and product-builders).

    Gist: WIP 31 minutes in still cookin'

    • a_t48 16 hours ago

      Hey, please don't blindly paste/post from LLMs.

      • xpe 7 hours ago

        I appreciate the "please", but this comes across as presumptive. First, you don't know the effort level I put in. Second, you haven't seen the end result. Third, why do you think I would "blindly paste" from an LLM? If you take a look at my profile or other comments, I hope that is clear.

        I appreciate feedback in general, and I am glad when people care about making HN a nice place for discussion and community. Sometimes a well-meaning person goes a little too far, and I think it happened above. That's my charitable interpretation. It is also possible that in this age of AI, people are understandably pissed and sending that frustration out into the world. When that happens, just remember the people reading it matter too.

        About me: I would not share something unless I think it has value to at least one other person on HN. I've done a lot of work about data and privacy in general (having worked at a differential privacy startup in the past), but I'm much newer to the idea of digging into ways of making telemetry gathering more transparent. I haven't found great resources on the Web about this yet, which is why I started doing the research. And I'm going to share it for others to read, criticize, build on top of, etc.

        • a_t48 5 hours ago

          Where is the gist? I assumed LLM/bot because of the disconnect between "here's a gist" and "still cookin"

          • xpe 4 hours ago

            I ask everyone to be a bit more careful about the "assume LLM/bot" thing. That hair-trigger is often counterproductive.

            Anyhow, the Claude research took 36 minutes to run, so I put it to the side and didn't link it originally. I'm still thinking through it -- there is a lot to cover : https://gist.github.com/xpe/654af2731d40a145e1d0b8b694fe8fd3

  • Banditoz 16 hours ago

    GitHub CLI is not SaaS. It's a command-line utility.

    • mynameisvlad 14 hours ago

      That doesn't mean it doesn't have usage patterns or other things telemetry would be useful for. And, at the rate these tools are being updated (multiple times a week, multiple times a day in some cases), they practically _are_ SaaS.

azalemeth 11 hours ago

One thing I noticed is that GitHub's "clone" drop-down menu used to include instructions on how to do so via git (i.e. `git clone /path/to/repo`). Now the only instructions provided by default are for `gh`, their client.

I can't help but wonder if these are related.

ptx 16 hours ago

Well, that validates my decision not to install it. Of course Microsoft will eventually abuse any trust you place in them and any access you give them. They always do. Don't let Microsoft run code on your machine and don't give them your data.

Kim_Bruning 19 hours ago

dev tools and especially libraries must not have telemetry unless absolutely strictly necessary (and even then!).

* Dev tools because you need to be able to trust they don't leak while you're working. Not all sites/locations/customers/projects allow leaks, and it's easier to just blacklist anything that does leak, so you know you can trust your tools, and the same habits, justfiles, etc work everywhere.

* libraries that leak deserve a special kind of hell. You add a library to your project, and now it might be leaking without warning. If a lot of libraries decide to leak, your application is now an unmanageable sieve.

If you do need to run telemetry, make it opt-in or end-user only. But if you as a developer don't even have control, then that's the worst.

Kim_Bruning 21 hours ago

what's the last version before telemetry... will want to pin there.

  • herpdyderp 20 hours ago

    According to their releases page: 2.90.0

traceroute66 20 hours ago

I suggest anyone who cares, and certainly anybody in the EU, mail privacy@github.com and also open a support ticket to let them know exactly what you think.

  • binaryturtle 19 hours ago

    Wouldn't telemetry solve this problem automatically? I mean: they should get some signal back when people opt-out no? :)

lukewarm707 18 hours ago

    # Telemetry FUCK OFF
    export DOTNET_CLI_TELEMETRY_OPTOUT=1
    export ASTRO_TELEMETRY_DISABLED=1
    export GATSBY_TELEMETRY_DISABLED=1
    export HOMEBREW_NO_ANALYTICS=1
    export NEXT_TELEMETRY_DISABLED=1
    export DISABLE_ZAPIER_ANALYTICS=1
    export TELEMETRY_DISABLED=1
    export GH_TELEMETRY=false

  • throwaranay4933 17 hours ago

    Also:

      # Atlas
      export DISABLE_TELEMETRY=1
      # CloudFlare
      export WRANGLER_SEND_METRICS=false
      export VERCEL_PLUGIN_TELEMETRY=off
      # AWS
      export SAM_CLI_TELEMETRY=0
      export CDK_DISABLE_CLI_TELEMETRY=true
      # ???
      export DO_NOT_TRACK=true

minraws 18 hours ago

Fuck GitHub man, fuck 'em. I mean, what even is the point? You lost the AI race or whatever it was; build a good product and features for developers, like you once tried to.

And less social media shit. Maybe add a better LFS alternative, similar to huggingface and such.

Git isn't the popular choice in game dev because of this assets-in-tree hosting nonsense; why haven't we fixed it yet?

Similarly many edge cases. Also, they finally built stacked PRs, but man does it feel underbaked, and it's what, 2+ years late?

Please just improve GitHub. Make me feel like I'm missing out if I'm not on GitHub because of the features, not because I have to be because of work.

natas 16 hours ago

Soon there will be ads on Github, you'll see.

lo1tuma 18 hours ago

This wouldn’t have happened with Nat Friedman.

NietTim 18 hours ago

There is no such thing as "pseudoanonymous". It's not a thing, it does not exist, it's an oxymoron.

sureglymop 15 hours ago

Why does github need a CLI? Some people just seem to want to make their own lives harder...

Datagenerator 18 hours ago

Don't confuse GIT(1), the protocol, with this (keep the EEE tactics in active memory).

0x3o3 20 hours ago

just use Radicle and never look back at centralised platforms.

sammy2255 20 hours ago

Today I learned GitHub has a CLI. I guess that's like Pornhub having a CLI

  • embedding-shape 20 hours ago

    Before GitHub had a CLI, I used cURL (via zsh aliases/functions) to open PRs and find what remote/branch a PR was associated with.

    Today I use a Golang CLI made with ~200K LOC to do essentially the same thing. Yay, efficiency?

  • falcor84 20 hours ago

    Seeing how annoying their website interfaces are, I'd actually be open to paying for API/CLI access to porn.

  • chris_money202 20 hours ago

    The gh CLI is one of the most powerful tools you can give a coding agent, imo

Henchman21 13 hours ago

Doesn't the github cli only utilize their API? As in, the tool is useless without that API? So couldn't they analyze the API instead?

shevy-java 15 hours ago

Microsoft really wants people to stop using GitHub.

Hopefully the Codeberg people can improve their UI - the UI is the single reason I still use GitHub (that, and filing issues is super simple). I could never handle GitLab because I hate their UI.

Note that GitHub has been in the news negatively over the last few months. I think we are seeing the first wear-and-tear signs. If people are smart, this would be a time when real competition to Microsoft GitHub could work. GitHub without users would be dead. Microsoft seems unable to care - they sold their soul to AI. It is make-it-or-break-it for them now.

msla 20 hours ago

I wonder how robust they are against people sending them fake data.

slackfan 15 hours ago

pseudoanonymous means not anonymous. /thread

m3kw9 15 hours ago

aka, your "anonymous" code will now be used to train our next coding models.

greatgib 19 hours ago

The current century is the one of enshittification. Like a cancer, there is now a whole generation of PMs for whom it is totally OK and legitimate to update your product to add spying on your users' usage.

It might seem legit to them, but I'm quite sure that just listening to your users is enough. It's not like they lack a user base ready to interact with them, or lack bugs and features to work on.

In most cases, telemetry is more a vanity metric that is rarely used. "Congrats to the team that shipped the most-used flag in the CLI." But even for product decisions, it is hard to extract conclusions from current usage, because what you can and will do today already depends on the way the CLI is built. A feature might not be used a lot because it is not convenient, or not available in as good a way as an alternative, but usage reports will not tell you whether it was useful. In the same way, when I buy a product there are often a lot of features I will never use but am happy to have; I might not have bought the product, or bought another one, if they were not available. The worst outcome would be the manufacturer removing or disabling a feature because it is not used...

raverbashing 19 hours ago

Do you know what doesn't collect telemetry?

the old git command in your terminal

I think I'll keep using that

wild_pointer 20 hours ago

pseudoanonymous, meaning not anonymous? lol

  • inetknght 20 hours ago

    Yes, that's exactly what pseudoanonymous means. It's fake-anonymous. It can be trivially de-anonymized.

varispeed 20 hours ago

pseudoanonymous = euphemism for not anonymous.

Regulators should wake up and fine them hard, so hard to become existential. Make an example for others not to follow.

  • xpe 18 hours ago

    Being a good regulator means solving a nearly impossible satisficing problem. You have to follow the law and achieve results with a limited budget and political constraints. Given the priorities of, say, the FTC or state AGs or the SEC, I don't think GitHub is even a blip on their radar. Of all the regulators, I would hazard a guess that the California Privacy Protection Agency is the most likely to prioritize a look, but I still doubt it.

    I know lots of idealists -- I went to a public policy school. And in some areas, I am one myself. We need them; they can push for their causes.

    But if you ever find yourself working as a regulator, you'll find the world is complicated and messy. Regulators that overreach often make things worse for their very causes they support.

    If you haven't yet, go find some regulators who have to take companies all the way to court and win. I have known some in certain fields. Learn from them. Some would probably really enjoy talking to a disinterested third party to learn the domain. There are even ways to get involved as a sort of citizen journalist, if you want.

    But these sorts of blanket calls to "make an example of GitHub" are probably a waste of time. I think a broader view is needed here. Think about the causal chain of problems and find a link where you have leverage. Then focus your effort on that link.

    I live in the DC area, where ignorance of how the government works leads to people walking away and not taking you seriously. When tech people put comparable effort into understanding the machinery of government that they do into technology, that is awesome. There are some amazing examples of this if you look around.

    There are no excuses. Tech people readily accept that they have to work around the warts of their infrastructure. (We are often lucky because we get to rebuild so much software ourselves.) But we forget what it's like to work with systems that have to resist change because they are coordination points between multiple stakeholders. The conflict is by design!

    Anyhow, we have no excuse to blame the warts in our governmental system. You either fix them or work around them or both.

    The world is a big broken machine. Almost no individual person is to blame. You just have to understand where to turn the wrench.

deathanatos 18 hours ago

I mean, it makes sense, of course. How else could they possibly know what users want? Run a bug tracker? Use their own software? Have more than one 9 of uptime? /s

Corporations can and will do every scummy thing permitted to them by law, so here we are. Until the US grows a backbone on issues of privacy, we shouldn't be surprised, I suppose. But the US won't be growing such a backbone anytime in the near future.

  • moi2388 18 hours ago

    Use their own software? Microsoft?!

neobrain 21 hours ago

tl;dr for opt-out as per https://cli.github.com/telemetry#how-to-opt-out (any of these works individually):

    export GH_TELEMETRY=false
    export DO_NOT_TRACK=true
    gh config set telemetry disabled  # starting from version 2.91.0, which this announcement refers to
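
A quick way to sanity-check a shell or CI environment (a sketch; the two variable names come from the opt-out doc above, while the function name and messages are mine):

```shell
# Report which documented environment opt-out, if any, is in effect.
gh_telemetry_status() {
  if [ "${GH_TELEMETRY:-}" = "false" ]; then
    echo "opted out via GH_TELEMETRY"
  elif [ "${DO_NOT_TRACK:-}" = "true" ]; then
    echo "opted out via DO_NOT_TRACK"
  else
    echo "no environment opt-out set"
  fi
}

gh_telemetry_status
```

Note this only checks the environment; the `gh config set telemetry disabled` state lives in `~/.config/gh/config.yml` instead.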

  • NeckBeardPrince 20 hours ago

      gh config set telemetry false
      ! warning: 'telemetry' is not a known configuration key

    What's strange is that if you check your `~/.config/gh/config.yml`, it will have put `telemetry: disabled` in there. But then, it will put anything in that `config.yml` lol.

      gh config set this-is-some-random-bullshit aww-shucks
      ! warning: 'this-is-some-random-bullshit' is not a known configuration key

    But in my config.yml is:

      this-is-some-random-bullshit: aww-shucks

  • nottorp 20 hours ago

    ... don't forget to recheck this info on every update, restore flags that have been "accidentally" reset, and set any new flags they added for "different" telemetry

jw_cook 18 hours ago

TL;DR:

    gh config set telemetry disabled

bugrasan 20 hours ago

doesn't this need to be opt-in according to EU GDPR?

djdillon 20 hours ago

FWIW, looks to remain disabled by default for enterprise users.