Copilot Autofix, a new addition to the GitHub Advanced Security service, analyzes vulnerabilities in code and offers code suggestions to help developers fix them.

  • Gladaed@feddit.org · arrow-up 4 · 3 months ago

    True, but unrelated. LLMs aren’t sentient; they’re just a tool that is useful at times.