Author Aide explained: Your questions answered

During our live webinar ‘Author Aide tips and tricks: Getting the most out of AI-powered authoring,’ presenters CEO Gavin Cooney and CPO Neil McGough ran out of time before they could answer all our attendees’ questions. But we didn’t want to leave any of our viewers hanging.

Here are the answers to all of the questions our presenters weren’t able to address live:

  1. When can we expect the functionality to embed Author Aide into our CMS?
    Author Aide is set to be available as a standalone API in late March 2024. In April/May, we’ll also extend support to launch Author Aide from inside Author API.
  2. How is the quality of the Author Aide assessment output tested?
    We spent a lot of time tuning the behind-the-scenes prompts to ensure the highest quality output. Tests have taken place with subject matter experts (teachers), as well as customers who took part in our beta testing phase. We continue to make refinements as the models change and we receive feedback from end-users.
  3. Does the item translation feature include Dutch?
    Author Aide’s translation language support will be extended to 43 languages, including Dutch, as of March 6, 2024.
  4. Can Author Aide automatically tag questions and responses to assessment criteria?
    We’ve worked with tag generation, including Bloom’s, DOK, Subject, and (US) Standard. This is working, but we’re waiting on the next step of mapping auto-generated tags to existing tags in customer item banks.
  5. How does the system prevent hallucinations?
    See answer #2. You can heavily mitigate hallucinations by grounding question generation in supplied context, i.e., your own source/learning material and objective/standard descriptions. However, it’s important to understand that we cannot completely safeguard against hallucinations, which is why human review is built into the product.
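In practice, grounding can be as simple as constraining the prompt to the supplied material. The sketch below is illustrative only (the function and wording are hypothetical, not Author Aide’s actual implementation):

```python
def build_grounded_prompt(source_text: str, objective: str, n_items: int = 3) -> str:
    """Compose a generation prompt restricted to supplied source material.

    The explicit instruction to use ONLY the provided text is what mitigates
    hallucination; human review is still required before items are saved.
    """
    return (
        f"Using ONLY the source material below, write {n_items} assessment "
        f"questions aligned to this objective: {objective}\n"
        "If the material does not cover the objective, say so rather than "
        "inventing facts.\n\n"
        f"--- SOURCE MATERIAL ---\n{source_text}\n--- END SOURCE MATERIAL ---"
    )

prompt = build_grounded_prompt(
    "Photosynthesis converts light energy into chemical energy.",
    "Explain the inputs and outputs of photosynthesis",
)
```

The key design choice is that the model is told both what context to use and what to do when the context falls short, which is where most hallucinations arise.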
  6. If you’re using RAG searches, where does that internal data come from?
    Content used in RAG (which is coming soon) will be documents uploaded by end-users. Initially, they’ll be able to upload PDF resources, which will be stored for future item generation.
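In broad strokes, RAG retrieves the most relevant passages from uploaded documents and supplies them to the model as context. A toy sketch of the retrieval step, with simple word-overlap scoring standing in for real embedding search (all names and data are illustrative):

```python
def retrieve(chunks: list[str], query: str, top_k: int = 2) -> list[str]:
    """Rank stored document chunks by crude word overlap with the query.

    Real systems use embedding similarity; overlap counting keeps the
    sketch self-contained while showing the same retrieve-then-generate shape.
    """
    q_words = set(query.lower().split())
    ranked = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]

# Chunks as they might look after splitting an uploaded PDF's text.
chunks = [
    "Mitosis produces two identical daughter cells.",
    "The French Revolution began in 1789.",
    "Cell division includes mitosis and meiosis.",
]
results = retrieve(chunks, "mitosis cell division", top_k=2)
```

The retrieved passages would then be fed into a grounded prompt like the one in answer #5, so generated items stay anchored to the uploaded material.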
  7. Can I use Author Aide to generate content based upon the content already in Learnosity? For example, if my curator created 1,000 high-quality algebra questions which are properly tagged, can I have Author Aide create another 500 questions similar to what I already have?
    Not today, but that is on our roadmap.
  8. What formats are supported when loading data into the system for question generation?
    The first release of RAG will support PDF only. From there, we’ll look for customer feedback to guide us on which file types to support next.
  9. For copyright purposes, does the tool show whether or not there was human interaction in the item development?
    Copyright is a complex topic in this fast-moving domain. As of today, AI-generated items can only be saved in the item bank after human review. We don’t currently track whether that person modified the item before (or after) saving.

Still have questions? Email us for answers.