• Option 1 – Solution exploration and initial data assessment

    If you have an idea or a data science challenge that you believe may solve a problem or improve a process, we will work with you to scope a new data application and assess the feasibility of the idea, including a high-level review of your data and a high-level cost-benefit appraisal.

    Typically no charge, no commitment.

    Option 2 – Proof of concept for a new solution

    We produce a functioning prototype solution to demonstrate the feel, functionality, and potential value of your data solution. Use this to prove the application in your organisation and decide whether a wider roll-out is suitable.

    Typically a fixed-price engagement, depending on the complexity and scope of the solution.

    Option 3 – Full solution adoption

    Usually following a proof of concept, we provide a fixed-price quote for the creation and implementation of your solution. In addition to a traditional purchase (fixed price + licence), we can also offer a complete SaaS option (monthly fee, low commitment) for most new solutions.

    Our existing solutions can also be adapted and configured to suit your circumstances, with SaaS and traditional pricing options available.

  • Typical steps in creating an AI solution

    Step 1: Your data

    Do you have data that captures the key variables of your process? Most organisations are drowning in data, but a common worry is that it is spread across different systems or is of questionable quality. AI needs data and reliable data feeds, so this is always the first consideration once the drivers for the AI solution have been established. Pulling data from multiple sources, verifying it, and cleaning data sets of errors and misleading readings is a standard part of this process.
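    As a minimal sketch of the verification and cleaning described above (the source systems, sensor names and valid range here are entirely hypothetical):

    ```python
    # Combine readings from two hypothetical source systems, then drop
    # records that are missing a value or fall outside a plausible range.
    system_a = [{"sensor": "flow_1", "value": 12.4}, {"sensor": "flow_1", "value": None}]
    system_b = [{"sensor": "flow_1", "value": 980.0}, {"sensor": "flow_1", "value": 11.9}]

    VALID_RANGE = (0.0, 100.0)  # readings outside this band are treated as misleading

    def clean(records):
        """Keep only records with a present, in-range value."""
        lo, hi = VALID_RANGE
        return [r for r in records if r["value"] is not None and lo <= r["value"] <= hi]

    combined = clean(system_a + system_b)  # the missing and the 980.0 readings are dropped
    ```

    In practice this stage usually involves dedicated data pipelines rather than hand-written filters, but the principle of pulling, verifying and cleaning is the same.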

     

    Step 2: Selecting or creating a data model to fit your application

    A model describes how the variables interact. For example, a bread-making data model might describe how baking time changes with different oven temperatures, ingredient quantities and moisture levels. Sometimes research has already been done into the relationship between the variables, and a data model can be selected from open online sources. In many cases, however, a new model will need to be created by examining the data to confirm various hypotheses, e.g. will a lower oven temperature and a longer cooking time result in a loaf of better or worse quality? By examining how multiple variables interact, and seeking statistically reliable relationships between them, a best-fit data model (sometimes called an algorithm) is created that can describe expected outcomes from given inputs. Complex AI solutions often involve multiple data models.
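    The bread example can be made concrete with a toy best-fit model. The data points below are invented, and a real model would involve many more variables, but the idea of fitting a statistically reliable relationship is the same:

    ```python
    # Ordinary least squares relating oven temperature (C) to baking
    # time (minutes), using four invented observations.
    temps = [180.0, 200.0, 220.0, 240.0]
    times = [45.0, 38.0, 32.0, 25.0]

    n = len(temps)
    mean_t = sum(temps) / n
    mean_y = sum(times) / n

    # Slope and intercept of the least-squares line: time = a * temp + b
    a = sum((t - mean_t) * (y - mean_y) for t, y in zip(temps, times)) / \
        sum((t - mean_t) ** 2 for t in temps)
    b = mean_y - a * mean_t

    def predict_time(temp):
        """Expected baking time at a given oven temperature."""
        return a * temp + b
    ```

    Here the negative slope captures the expected relationship: a hotter oven means a shorter baking time.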

    Step 3: Teach and test

    This is an important step, where the full strength of the data is converted into ‘artificial experience’. Teaching can be both a human and a machine process, typically with a process expert providing input on some of the significant real events that have happened and describing how they show up in the data. The machine is left to crunch the years of more mundane data.

    The model is tested for accuracy by allowing it to predict results and observing how closely these match real-world outcomes. Final calibration and adjustments can then be made to improve the match.
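    A sketch of this accuracy test, using an invented fitted model and invented held-out observations:

    ```python
    # Hold back some known (input, outcome) pairs, let the model predict
    # them, and measure how closely the predictions match reality.
    def model(temp):
        """A hypothetical fitted model: baking time as a function of oven temp."""
        return -0.33 * temp + 104.3

    held_out = [(190.0, 41.0), (230.0, 28.5)]  # (oven temp, observed baking time)

    errors = [abs(model(temp) - observed) for temp, observed in held_out]
    mae = sum(errors) / len(errors)  # mean absolute error across held-out data
    ```

    If the error is too large, the model is recalibrated and tested again until predictions and real-world outcomes agree closely enough for the application.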

    Step 4: Operationalise

    Arguably, this is the step that differentiates AI research projects from commercially deployed AI solutions: for example, Alexa answering instantly, or a Tesla driving itself without crashing. These AI models work because they are almost perfectly operationalised, i.e. the theory is matched by the way they work in reality. Not all solutions need to operate in real time, but all solutions need robust data feeds and the ability to compute quickly, so that the user gets feedback as soon as it is needed. Scalability to deal with large volumes of incoming data or users is also an important element of operationalisation. Typically, the hardware for measuring and feeding the latest data into the data model needs to be in place to support the speed of the process. Cloud computing is often the facilitator, allowing input data to be processed and an answer served back to the user in a fraction of a second. Whatever the AI challenge, operationalising the solution so that the user’s experience is seamless is a critical step.

  • To find out more about how Collaborate Water can use the power of your data to transform your operations:

     

    Email us: info@collaboratewater.com

     

    Sign up for our newsletter: