cy-ai


    Cypress AI


    🧪 Cypress AI command that generates E2E tests with an LLM (Large Language Model):

    cy.ai(string)
    

    Read the wiki.

    NPM:

    npm install cy-ai --save-dev
    

    Yarn:

    yarn add cy-ai --dev
    

    If you're using TypeScript, import the command using ES2015 syntax:

    echo "import 'cy-ai'" >> cypress/support/commands.ts
    

    Or if you're using JavaScript, use CommonJS require:

    echo "require('cy-ai')" >> cypress/support/commands.js
    
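Either command appends a single line that loads the plugin. For reference, a fresh support file would end up containing just this (a sketch, assuming nothing else was in the file):

```javascript
// cypress/support/commands.js
// Loading the plugin registers the cy.ai custom command for all specs.
require('cy-ai')
```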

    Start the Ollama server:

    ollama serve
    

    Download the LLM:

    ollama pull qwen2.5-coder
    

    Write a test:

    // cypress/e2e/example.cy.js
    it('visits example.com', () => {
      cy.visit('/')
      cy.ai('see heading "Example Domain"')
    })
    Tip: If you're running Chrome, disable chromeWebSecurity so the LLM requests aren't blocked by CORS:

    // cypress.config.js
    import { defineConfig } from 'cypress'

    export default defineConfig({
      chromeWebSecurity: false,
    })

    Generate Cypress tests with AI:

    cy.ai(string[, options])
    

    llm: LangChain Runnable to invoke. Defaults to a prompt template using the Ollama model qwen2.5-coder.

    Use a different large language model:

    import { Ollama } from '@langchain/ollama'
    import { prompt } from 'cy-ai'

    const llm = new Ollama({
      model: 'codellama',
      numCtx: 16384,
    })

    const chain = prompt.pipe(llm)

    cy.ai('prompt', { llm: chain })

    Or customize the template:

    import { PromptTemplate } from '@langchain/core/prompts'
    import { Ollama } from '@langchain/ollama'

    const llm = new Ollama({
      model: 'codellama',
      numCtx: 16384,
    })

    const prompt = PromptTemplate.fromTemplate(`
    You are writing an E2E test step with Cypress.

    Rules:
    1. Return JavaScript Cypress code without "describe" and "it".

    Task: {task}

    HTML:
    \`\`\`html
    {html}
    \`\`\`
    `)

    const chain = prompt.pipe(llm)

    cy.ai('prompt', { llm: chain })
    Important: Don't forget to pull the Ollama model:

    ollama pull codellama
    

    log: Whether to display the command logs. Defaults to true:

    cy.ai('prompt', { log: true })
    

    Hide Cypress and console logs:

    cy.ai('prompt', { log: false })
    

    regenerate: Whether to regenerate the Cypress step with AI. Defaults to false:

    cy.ai('prompt', { regenerate: false })
    

    Regenerate the Cypress step with AI:

    cy.ai('prompt', { regenerate: true })
    

    timeout: Time to wait in milliseconds. Defaults to 2 minutes (120000):

    cy.ai('prompt', { timeout: 120000 })
    

    Set timeout to 5 minutes:

    cy.ai('prompt', { timeout: 1000 * 60 * 5 })
    

    Configure global options for cy.ai:

    cy.aiConfig(options)
    

    Override default options:

    cy.aiConfig({
      llm: chain,
      log: false,
      regenerate: true,
      timeout: 1000 * 60 * 3, // 3 minutes
    })

    Set timeout to 5 minutes:

    cy.aiConfig({
      timeout: 1000 * 60 * 5,
    })
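Presumably, options passed to an individual cy.ai call take precedence over the global config for that step only; a sketch under that assumption:

```javascript
// Assumption: per-call options override cy.aiConfig defaults per step.
cy.aiConfig({ timeout: 1000 * 60 * 2 }) // global 2-minute default

it('completes a slow flow', () => {
  cy.visit('/')
  // This step alone gets 5 minutes; other steps keep the global default.
  cy.ai('complete the checkout flow', { timeout: 1000 * 60 * 5 })
})
```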

    See how to use Anthropic.
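As a hedged sketch of what the Anthropic wiring might look like (not the wiki's exact recipe): swap the Ollama model for @langchain/anthropic's chat model and pipe the exported prompt into it. The model name is illustrative, and the API key is assumed to be passed via Cypress env:

```javascript
import { ChatAnthropic } from '@langchain/anthropic'
import { prompt } from 'cy-ai'

const llm = new ChatAnthropic({
  model: 'claude-3-5-sonnet-latest', // illustrative model name
  apiKey: Cypress.env('ANTHROPIC_API_KEY'), // assumption: key provided via Cypress env
})

// Route every cy.ai call through the Anthropic-backed chain.
cy.aiConfig({ llm: prompt.pipe(llm) })
```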

    License: MIT