By Tim Warren on June 07, 2023

Organisational Readiness: AI and GPT

AI and GPT – what the heck happened?  It's undeniable that GPT and AI have hit businesses around the world like a technological tsunami.  But one key question needs to be addressed – is your organisation ready?

We’ve summarised a few key points as a checklist to help you establish your organisational readiness for all things Artificial Intelligence (AI) and GPT (Generative Pre-trained Transformers)…

Define who owns the AI policies:

This is key, as someone needs to be responsible for your organisation's AI policies – it can't be left to chance.  In an SME (small to medium enterprise), this is most likely to be the CEO (Chief Executive Officer); in a large organisation, the COO (Chief Operating Officer) or CRO (Chief Revenue Officer) might take ownership.

In time, you can select or appoint a Key Owner of AI and related policies.

Define your RACI (Who is Responsible, Accountable, Consulted, Informed?)

Doing this will clarify responsibilities and ensure that everything needed is clearly assigned.

It might look like this: 

RACI           Initially                   Ultimately
Responsible    Chief Operating Officer     Head of AI
Accountable    Board                       Chief Operating Officer
Consulted      Executive                   Executive and Management
Informed       Entire company              Entire company



Do a SWOT analysis

Carrying out a SWOT (strengths, weaknesses, opportunities, threats) analysis will highlight the risks and threats to your business model and operations.

It will also define the opportunities that the technology can bring, both to the existing business and to new areas.  For example:

    • how can you use AI to improve current operations?
    • can you improve margins, lower costs, improve output or increase productivity?
    • what do you already know about AI?
    • do you have any internal resources that can help to lead this effort?

 

Create policies and frameworks

Start early in forming the basis of your AI/GPT policies and frameworks. If you have a knowledgeable Key Owner, consider keeping this work in house; if it can't be managed internally, consider engaging a suitable partner to assist in drafting them. Policies are important to get right, so get help if you need it.

Once completed, review your contracts with partners and customers to confirm what rights you have to use external tools.

Policies should be shared with and understood by your staff – anyone who is likely to be using AI/GPT functionality as part of their role.

 

Add key risks from AI and GPT to the risk register

This is the register held at executive/board level. Add key risks as they arise so they can be reviewed, managed and/or mitigated. Create a risk review cycle – monthly would be a typical review cadence.

Side note: don't have a risk register? It's time you got one – Contact Us for a free template.
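
If it helps to picture it, here's a minimal sketch of what an AI/GPT entry in a risk register might capture. The fields shown are illustrative assumptions, not a prescribed format.

from dataclasses import dataclass
from datetime import date

@dataclass
class RiskEntry:
    risk: str          # short description of the risk
    owner: str         # who manages it day to day
    likelihood: str    # e.g. low / medium / high
    impact: str        # e.g. low / medium / high
    mitigation: str    # agreed action to reduce or manage the risk
    next_review: date  # aligns with your monthly review cycle

example = RiskEntry(
    risk="Staff paste customer data into public GPT tools",
    owner="Chief Operating Officer",
    likelihood="medium",
    impact="high",
    mitigation="AI usage policy plus staff training and spot checks",
    next_review=date(2023, 7, 7),
)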

 

Work on an executive outline for the board

Create an outline of the organisation's readiness for AI and GPT. Ensure the board is paying sufficient attention – it's key to get this right, that everyone understands it, and that everyone follows it.

 

Agree what tasks external tools CAN be used for:

These could be wide-reaching across your business.  General and social media content that is destined to be public is most likely not a problem.

For example, ‘Write a LinkedIn post welcoming Sarah to the HR team at ABCD.com’

(Helpful tip: best practice is to use a fake name and replace it after creating the content – see the short sketch at the end of this section.)

Another low-risk request could be ‘Create an email to customers wishing them a happy new year’

Content like this is low risk in terms of breaching security or data protection.
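
As a lightweight illustration of the fake-name tip above, here's a minimal sketch in Python. The placeholder name and the generate_draft() function are assumptions standing in for whichever external tool you use, not a real API.

PLACEHOLDER_NAME = "Jane Placeholder"   # fake name sent to the external tool
REAL_NAME = "Sarah"                     # real name, kept internal

def generate_draft(prompt: str) -> str:
    # Stand-in for the external GPT tool; a real call would use the prompt.
    return f"Please join us in welcoming {PLACEHOLDER_NAME} to the HR team!"

prompt = f"Write a LinkedIn post welcoming {PLACEHOLDER_NAME} to the HR team at ABCD.com"
draft = generate_draft(prompt)

# Swap the real name in only after the content is back in-house.
final_post = draft.replace(PLACEHOLDER_NAME, REAL_NAME)
print(final_post)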

 

Agree what tasks external tools CANNOT be used for:

This area is far more critical and needs to be clearly defined and understood.

Assume that any data passed to ChatGPT is saved, stored and potentially used as training data.  Therefore, the golden rule is: never put anything into a public GPT product that you wouldn't want published on a news website.

Examples include:

    • don't include anyone's personal details
    • don't use any identifying information for any person or entity
    • don't send prompts to GPT that contain customer data
    • don't use any internal company IP (intellectual property)
    • don't use any non-public-domain IP in external tools

Example: don't ask ChatGPT this:

“Write an email to our customers X.com, Y.com, and Z.com thanking them for signing up to the Pro plan”

Problem: This identifies customer names and what they have bought 

 

“Write a thank-you letter to John Smith of 445 Ridge Rd, CA, 40411 thanking him for the payment of his credit card balance of $990.”

Problem: Personal data

 

“Write an internal guide on how to use this API:

 Obj MySecureData = SpecialData;

MySecureData.secretpassword = ‘qw34dbfdoglegs’;”

Problem: IP and secrets shared 
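
One practical way to back up these rules is a simple pre-send check that flags prompts containing obvious personal or customer data before they reach an external tool. This is a rough sketch only – the patterns and the customer list are illustrative assumptions, not a complete data-loss-prevention solution.

import re

KNOWN_CUSTOMERS = {"X.com", "Y.com", "Z.com"}        # your real customer list would live elsewhere
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
LONG_NUMBER_PATTERN = re.compile(r"\b\d{4,}\b")      # crude: long digit runs often mean account or card data

def prompt_is_safe(prompt: str) -> bool:
    # Return False if the prompt obviously contains data that policy says must stay internal.
    if EMAIL_PATTERN.search(prompt) or LONG_NUMBER_PATTERN.search(prompt):
        return False
    if any(customer.lower() in prompt.lower() for customer in KNOWN_CUSTOMERS):
        return False
    return True

print(prompt_is_safe("Create an email to customers wishing them a happy new year"))   # True
print(prompt_is_safe("Thank X.com for signing up to the Pro plan"))                   # False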



In summary

AI and GPT are very new technologies, so we're all still learning.  Putting some basic plans in place to manage your business's use of AI and GPT will help to protect your data and IP, and prevent your organisation from being in the news for all the wrong reasons.

 

Checklist

    • define who owns the policies
    • define your RACI
    • do a SWOT analysis 
    • create policies and frameworks
    • run a risk register – add and review monthly 
    • draft an executive outline for AI/GPT organisational readiness
    • agree what tasks can and cannot be carried out using external tools
    • communicate the policies and guidelines to your team, customers, partners



