How to Debug Chat in OMate (Context Log)

In the conversation interface, you can view the final prompt that OMate sends to the API via the "Context Log":

  1. Click the more button in the upper right corner of the conversation interface
  2. Select "Context Log"


Mainstream large-model APIs are currently stateless: the model has no memory of its own, so each request's context is its entire memory and set of instructions. As a result, almost every optimization problem can be traced by inspecting the context log. Common issues include:

  • Example exchanges were specified in the role settings, but the conversation history uses a different format, so the output no longer matches the expected format
  • The system prompt and the role prompt contradict each other in places, leaving the model unsure which to follow
  • You believe a setting was enabled or a knowledge base was connected, but it did not take effect in the conversation; the context log shows clearly whether it was actually included
  • After starting a new topic, content from previous topics still appears; checking the context reveals which part of memory intervened
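Because the API is stateless, everything the model "knows" must be replayed in every request. The sketch below illustrates the general pattern; the function and field names are illustrative assumptions, not OMate's actual internals.

```python
# Minimal sketch of how a stateless chat request is assembled.
# build_context and its parameters are hypothetical names for illustration.

def build_context(system_prompt, role_prompt, history, user_message):
    """Stitch everything the model will 'remember' into one request payload.

    Because the API keeps no state between calls, anything missing from
    this list -- an example format, a knowledge-base snippet, a topic
    boundary -- is simply invisible to the model.
    """
    messages = [{"role": "system", "content": system_prompt + "\n" + role_prompt}]
    messages.extend(history)  # prior turns, replayed verbatim each request
    messages.append({"role": "user", "content": user_message})
    return messages

ctx = build_context(
    system_prompt="You are a concise assistant.",
    role_prompt="Always answer in bullet points.",
    history=[
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "- Hello!"},
    ],
    user_message="Summarize our chat.",
)
# The context log shows exactly this assembled list: if a knowledge-base
# snippet or setting is absent here, it could not have influenced the reply.
```

Each of the issues listed above corresponds to something visible in this assembled list: a conflicting prompt, a mismatched example, or a leftover history turn.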


In addition to the final context, we also provide a Thinking Log, which records OMate's behavior during context assembly to help you troubleshoot and locate issues.

We hope you make good use of this tool to locate and solve problems faster.
