Ok, so last time I said the last thing we needed was streaming responses. Well, it turns out the last last thing we need is to seed our conversation with a system prompt.
What is a system prompt, you ask? A system prompt lets us set up the initial behaviour of the AI assistant. A good prompt will put the AI into proofreading mode or business-plan-expert mode, for instance. By making this a first-class citizen in our system, we can offer a ton of flexibility to our users.
Our system prompts will be super simple, just a title and a big text area for the content:
rails g scaffold system_prompt title:string! content:text!
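For reference, the generated migration should look roughly like this; the trailing ! on the field types asks the generator for NOT NULL columns (supported in newer Rails versions), and the timestamp in the filename will of course differ:

# db/migrate/XXXXXXXXXXXXXX_create_system_prompts.rb (roughly what the generator produces)
class CreateSystemPrompts < ActiveRecord::Migration[8.0] # version tag depends on your Rails version
  def change
    create_table :system_prompts do |t|
      t.string :title, null: false
      t.text :content, null: false

      t.timestamps
    end
  end
end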
Next we’ll add the prompt to the Conversation form:
<!-- app/views/conversations/_form.html.erb -->
<div>
  <%= form.label :system_prompt_id, style: "display: block" %>
  <%= form.collection_select :system_prompt_id, SystemPrompt.order(:title), :id, :title %>
</div>
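One optional tweak: by default the select forces a choice, so every new conversation gets a prompt. If you’d rather let users skip it, pass include_blank (the guard in the model below already copes with a blank value):

<!-- optional: allow conversations without a system prompt -->
<%= form.collection_select :system_prompt_id, SystemPrompt.order(:title), :id, :title, include_blank: true %>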
Then we’ll update the controller and model:
# app/controllers/conversations_controller.rb
class ConversationsController < ApplicationController
  # ... existing actions ...

  private

  def conversation_params
    params.expect(conversation: [ :title, :system_prompt_id ])
  end
end
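For context, here’s a sketch of the create action that consumes conversation_params (a scaffold-style version; yours may differ). The important bit is that system_prompt_id arrives via mass assignment, which is why the model below declares a virtual attribute for it.

# app/controllers/conversations_controller.rb (typical create action, adapt to yours)
def create
  @conversation = Conversation.new(conversation_params)

  if @conversation.save
    redirect_to @conversation, notice: "Conversation was successfully created."
  else
    render :new, status: :unprocessable_entity
  end
end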
# app/models/conversation.rb
class Conversation < ApplicationRecord
  # system_prompt_id isn't a column on conversations, so expose it as a
  # virtual attribute to avoid assignment errors from conversation_params
  attr_accessor :system_prompt_id

  after_create_commit :add_system_prompt

  private

  def add_system_prompt
    return if system_prompt_id.blank? # skip when no prompt was picked (nil or "")

    system_prompt = SystemPrompt.find(system_prompt_id)
    messages.create(role: :system, message: system_prompt.content)
  end
end
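You can sanity-check the callback in the Rails console. Assuming the Message model from the earlier posts stores the role as an enum and the text in a message column (the titles and content below are just examples):

# bin/rails console
prompt = SystemPrompt.create!(title: "Proofreader", content: "You are a meticulous proofreader.")
conversation = Conversation.create!(title: "Edit my post", system_prompt_id: prompt.id)

conversation.messages.first.role    # => "system"
conversation.messages.first.message # => "You are a meticulous proofreader."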
That’s it. Not magical, but it gives users an easy way to tweak the system so the AI gives them exactly what they need.
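If you’d like a couple of starter prompts to play with, a few lines in db/seeds.rb will do; the titles and content here are only examples:

# db/seeds.rb (example starter prompts; tweak freely)
[
  { title: "Proofreader",
    content: "You are a meticulous proofreader. Point out typos, grammar issues, and awkward phrasing." },
  { title: "Business plan expert",
    content: "You are an experienced business advisor. Help the user sharpen their business plan." }
].each do |attrs|
  SystemPrompt.find_or_create_by!(title: attrs[:title]) do |prompt|
    prompt.content = attrs[:content]
  end
end

Run bin/rails db:seed and the picker on the conversation form will have something to offer.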