
Working Paper 04: Benefit / risk assessment of using GenAI coding tools

Apr 2, 2024

Executive Summary

  • Sema estimates a 41X ROI over two years from adopting GenAI coding tools such as GitHub Copilot.
  • We’ve identified four risks of using GenAI coding tools.
  • With proper prevention and mitigation planning, the risks are manageable.
  • Next Steps: Sema recommends that organizations work with Counsel to determine whether GenAI coding tools should be permitted. If so, the proper tools and usage should be adopted; if not, usage should be prevented.

GenAI coding tools benefit developer satisfaction and engineering productivity

Sema AI Working Paper 01: High ROI AI Activities, from which this section was excerpted, is available upon request.
We estimate that the Return on Investment for software engineers using GenAI tools can be 41X over two years.

This calculation, sketched in the example below, is based on:

  • Developer productivity increase of at least 10%, which Sema considers a minimum achievable level.
  • Cost of Enterprise-grade licenses.
  • Low implementation costs, since developers learn GenAI tools quickly.
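
To make the arithmetic concrete, here is a minimal sketch of how a benefit-to-cost ratio of this kind can be computed. The function and every input value below are illustrative assumptions, not figures from Sema's model; the actual calculation is documented in Working Paper 01.

```python
# Illustrative ROI sketch for GenAI coding tools.
# All inputs are hypothetical placeholders, not Sema's model figures.

def genai_coding_roi(
    dev_cost_per_year: float,      # fully loaded cost of one developer
    productivity_gain: float,      # e.g. 0.10 for the 10% minimum uplift
    license_cost_per_year: float,  # enterprise-grade seat price
    implementation_cost: float,    # one-time onboarding / training cost
    years: int = 2,
) -> float:
    """Return the benefit-to-cost ratio over the given horizon."""
    benefit = dev_cost_per_year * productivity_gain * years
    cost = license_cost_per_year * years + implementation_cost
    return benefit / cost

# Example with placeholder inputs; the output depends entirely on them.
ratio = genai_coding_roi(
    dev_cost_per_year=200_000,
    productivity_gain=0.10,
    license_cost_per_year=480,   # placeholder: roughly $40/month per seat
    implementation_cost=500,
)
print(f"{ratio:.0f}x ROI over two years")
```

The headline 41X figure itself is derived in Working Paper 01 from Sema's own inputs.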

The four risks from GenAI coding tools are manageable

Risks from GenAI usage can be grouped into four categories:

  • Data leakage: Users put proprietary or confidential information into public or less secure GenAI tools.
  • Security risks: GenAI-generated code can be less secure than company-written code, and especially less secure than open-source code (Stanford 2023, Cornell 2023, Sema research).
  • Quality risks: GenAI-generated code may be incorrect or not fully understood by the developers using it, and coding tools may lack the contextual awareness needed for the problems developers ask them to solve.
  • IP risks: Using GenAI tools may leave organizations without legal ownership of their code.

Each of these risks has an achievable prevention / mitigation plan:

  • Data leakage: Give coders Enterprise-grade licenses to one or more approved LLMs and prohibit the use of other tools.
  • Security risks: Ensure GenAI-generated code passes through the same security gates (SAST/DAST, CVE scanning) as all other software; a minimal pipeline sketch follows this list.
  • Quality risks: Put GenAI-generated code through the same quality gates as non-GenAI code, e.g. automated assessments and code reviews, and ensure developers understand the code they are using.
  • IP risks: Proper tooling and usage should protect organizations whose coders use GenAI from legal action by training-material creators, without preventing protection of the resulting IP, including trade secrets, patents, and copyright. Note that this is not legal advice; organizations should consult with Counsel. Sema AI Working Paper 02: Assessing IP Risks of Coders Using GenAI, available upon request, provides additional detail.
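
As one concrete, hypothetical illustration of the security gate above, the sketch below wires two widely used open-source scanners into a pipeline step: bandit for static analysis (SAST) and pip-audit for known CVEs in dependencies. The tool choices, the src/ directory layout, and the pass/fail handling are assumptions for illustration only; the point is that GenAI-assisted code runs through exactly the same gates as any other code.

```python
# Hypothetical CI gate: GenAI-assisted code passes the same checks as any other code.
# Assumes bandit (SAST) and pip-audit (dependency CVE scan) are installed in the CI image.
import subprocess
import sys

GATES = [
    ["bandit", "-r", "src"],  # static analysis of application code
    ["pip-audit"],            # scan installed dependencies for known CVEs
]

def run_gate(cmd: list[str]) -> bool:
    """Run one scanner; a non-zero exit code means findings were reported."""
    print("Running:", " ".join(cmd))
    return subprocess.run(cmd).returncode == 0

if __name__ == "__main__":
    results = [run_gate(cmd) for cmd in GATES]  # run every gate, even after a failure
    if not all(results):
        sys.exit("Security/quality gate failed: resolve findings before merging.")
```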

For More Information

Relationship to other Working Papers:

  • Working Paper 01: High ROI AI Activities. Explains the calculation behind the ROI for coders using GenAI.
  • Working Paper 02: Assessing IP Risks of Coders Using GenAI. One of the high-ROI activities, developers using GenAI tools while coding, is only worth the investment if the generated codebase can receive sufficient IP protection. Sema’s research indicates that the IP can indeed be protected, subject to your Counsel’s final determination.
  • Working Paper 03: Comparison of Tiers of GitHub Copilot GenAI Coding Tools. A critical method to reduce risks from coders using GenAI tools is to manage access to the appropriate tools.

Contact ai@semasoftware.com.

Disclosure

Sema publications should not be construed as legal advice on any specific facts or circumstances. The contents are intended for general information purposes only. To request reprint permission for any of our publications, please use our “Contact Us” form.

The availability of this publication is not intended to create, and receipt of it does not constitute, an attorney-client relationship. The views set forth herein are the personal views of the authors and do not necessarily reflect those of the Firm.
