Beyond the Default - Building a Smarter AI ChatBox in Oracle APEX
Voice input, timestamps, retry, quick prompts, and export - all built on top of the native Show AI Assistant action.
Introduction
A few weeks back, I was exploring what Oracle APEX could do with AI integration. I came across the built-in Show AI Assistant dynamic action and honestly, I was impressed - dropped it on a page and had a working chat interface in minutes. But then I started actually using it, and the same frustrations kept coming up.
There was no timestamp on messages, so after a long conversation I had no idea when something was said. I'd get an answer I wasn't happy with and have to retype the whole question again. I wanted to just speak my question instead of typing. And when I had a useful chat, there was no way to save it. I thought - surely I can fix these things without modifying the core widget?
So I spent some time digging into the DOM structure that APEX generates for the dialog. What I found was that by using a MutationObserver to watch when the dialog renders, I could inject my own enhancements cleanly on top - no hacks, no touching the native component. Here's everything I ended up building:
- Voice Input - Speak your question instead of typing. Uses the browser's Web Speech API and works smoothly in Chrome.
- Message Timestamps - Every message, both user and AI responses, automatically displays the sent or received time.
- Retry Button - Resend your last question with a single click, without typing it again.
- Quick Prompt Chips - One-click shortcuts for common actions like Summarise, Write Email, Fix Error, Translate to Tamil, and Explain Simply.
- Export Chat - Download the complete conversation as a .txt file, or open a print-friendly PDF view for future reference.
- Text to Speech - Listen to AI responses using the built-in Read button available for every reply.
Prerequisites - Wiring Up Generative AI
Before any of this works, you have to connect APEX to an AI provider. Initially I looked at OpenAI, but I went with Cohere instead - no credit card needed for the trial key, and the free tier was enough for what I was building. Here's exactly how I set it up.
Step 1 - Create a Cohere Account & Get Your API Key
Go to dashboard.cohere.com and create a free account. Once you're logged in, head to API Keys in the left sidebar, click New Trial Key, give it any name, and hit Generate. One thing I learned the hard way - copy that key the moment it appears. Once you navigate away, you can't retrieve it again and you'll have to generate a new one.
Step 2 - Open Workspace Utilities
I initially spent a minute hunting for this. There are two ways in - either through App Builder → (dropdown arrow) → Workspace Utilities, or through the hamburger menu at the top left: ≡ → Workspace → Workspace Utilities. Both get you to the same place.
Step 3 - Select Generative AI
On the Workspace Utilities page you'll see a list of options. Find and click Generative AI - this is where you manage and create AI service configurations that your APEX apps can reference.
Step 4 - Create the AI Service
Click Create and fill in the form exactly as shown below. I'm sharing this table because I got a few field values wrong on my first attempt and wasted time debugging:
| Attribute | Value |
|---|---|
| Name | Cohere AI |
| Static ID | COHERE_AI |
| Provider | Cohere |
| Base URL | https://api.cohere.ai/v1 |
| Model ID | command-r-plus |
| Credential | Click + to create new |
| Credential Name | COHERE_CRED |
| Client ID | COHERE |
| Client Secret / Password | your_api_key_here |
| App Builder Access | Enabled |
Step 5 - Configure the AI Tab
This one caught me off guard. I set up the service, added the dynamic action, ran the page - and got a "static ID not found" error. After some head-scratching I realised I'd missed this step entirely. You need to go to Edit Application Definition → AI tab and select your service from the dropdown there. The "Content Message" is just a consent notice for users - you can leave it blank if you want, but don't skip the service selection or you'll hit that same error.
Setting Up the Page
1. Add a Button
I created a simple button and labelled it Chat with AI. Nothing special here - it's just the trigger that opens the assistant dialog when clicked.
2. Create a Dynamic Action on the Button Click
Right-click the button and create a dynamic action. The setup I used:
- Event: Click
- True Action 1: Show AI Assistant
For the Show AI Assistant action itself, these are the attribute values I configured:
| Attribute | Value |
|---|---|
| Title | Assistant |
| Generative AI Service | Cohere AI |
| System Prompt | You are a helpful assistant. |
| Welcome Message | Hi! How can I help you today? |
The JavaScript - Function and Global Variable Declaration
This is the heart of the whole thing. When I first tried adding the mic button and toolbar directly in the page load dynamic action, they weren't rendering - the dialog DOM simply didn't exist yet at that point. That's when I switched to using a MutationObserver, which watches for the dialog to appear and then fires all the enhancement functions. Much more reliable.
Paste everything below into the page's Function and Global Variable Declaration section. It sets up the mic, toolbar, export functions, timestamps, and the observer - all the pieces that the boot script (next section) will call.
// Global Variables
var apexMicRecognition = null;
var apexMicListening = false;
var apexLastMessage = '';
// Inject shared styles
function injectChatStyles() {
if (document.getElementById('apex-chat-styles')) return;
var style = document.createElement('style');
style.id = 'apex-chat-styles';
style.textContent = `
@keyframes mic-pulse {
0% { box-shadow: 0 0 0 0 rgba(229,57,53,0.6); }
70% { box-shadow: 0 0 0 12px rgba(229,57,53,0); }
100% { box-shadow: 0 0 0 0 rgba(229,57,53,0); }
}
@keyframes apex-spin { to { transform: rotate(360deg); } }
#apex-mic-btn {
position: absolute; right: 8px; top: 50%;
transform: translateY(-50%);
width: 34px; height: 34px; border-radius: 50%;
background: #0572ce; border: none; cursor: pointer;
display: flex; align-items: center; justify-content: center;
box-shadow: 0 2px 8px rgba(5,114,206,0.35);
z-index: 9999; transition: background 0.2s;
}
#apex-mic-btn.listening {
background: #e53935 !important;
animation: mic-pulse 1.2s infinite;
}
#apex-chat-toolbar {
display: flex; align-items: center;
gap: 6px; padding: 6px 10px 4px 10px;
border-top: 1px solid #e5e8ec;
flex-wrap: wrap; background: #fff;
}
.apex-quick-prompt {
padding: 4px 10px; border-radius: 14px;
border: 1px solid #0572ce; background: #fff;
color: #0572ce; font-size: 11px; cursor: pointer;
font-family: sans-serif; white-space: nowrap;
transition: background 0.15s, color 0.15s;
}
.apex-quick-prompt:hover { background: #0572ce; color: #fff; }
.apex-tool-btn {
width: 30px; height: 30px; border-radius: 50%;
border: none; background: transparent; cursor: pointer;
display: flex; align-items: center; justify-content: center;
color: #6c7a8d; transition: background 0.2s, color 0.2s;
margin-left: auto; flex-shrink: 0;
}
.apex-tool-btn:hover { background: #e8f0fe; color: #0572ce; }
.apex-tool-btn.active { color: #0572ce; background: #e8f0fe; }
#apex-export-menu {
display: none; position: absolute;
bottom: 100%; right: 0;
background: #fff; border: 1px solid #dde1e7;
border-radius: 10px; box-shadow: 0 4px 16px rgba(0,0,0,0.12);
z-index: 99999; overflow: hidden; min-width: 140px;
}
#apex-export-menu.visible { display: block; }
.apex-export-opt {
display: flex; align-items: center; gap: 8px;
padding: 10px 16px; cursor: pointer;
font-size: 13px; font-family: sans-serif; color: #333;
transition: background 0.15s;
}
.apex-export-opt:hover { background: #f0f4ff; }
.apex-msg-time {
font-size: 10px; color: #9aa3af;
font-family: sans-serif;
display: block; margin-top: 3px;
}
.apex-msg-actions {
display: flex; gap: 6px; margin-top: 6px;
}
.apex-msg-action-btn {
background: none; border: 1px solid #dde1e7;
border-radius: 12px; padding: 2px 8px;
font-size: 11px; cursor: pointer; color: #6c7a8d;
font-family: sans-serif;
transition: background 0.15s, color 0.15s;
display: flex; align-items: center; gap: 4px;
}
.apex-msg-action-btn:hover { background: #e8f0fe; color: #0572ce; border-color: #0572ce; }
.apex-msg-action-btn.speaking { color: #e53935; border-color: #e53935; }
#apex-toolbar-right {
position: relative; display: flex;
align-items: center; gap: 4px; margin-left: auto;
}
`;
document.head.appendChild(style);
}
// Set textarea value (APEX-safe)
function setTextareaValue(input, text) {
input.focus();
var nativeSetter = Object.getOwnPropertyDescriptor(
window.HTMLTextAreaElement.prototype, 'value'
).set;
nativeSetter.call(input, text);
input.dispatchEvent(new Event('input', { bubbles: true }));
input.dispatchEvent(new Event('change', { bubbles: true }));
}
// Format current time
function apexNow() {
var d = new Date();
var h = d.getHours(), m = d.getMinutes();
var ampm = h >= 12 ? 'PM' : 'AM';
h = h % 12 || 12;
return h + ':' + (m < 10 ? '0' + m : m) + ' ' + ampm;
}
// Get all chat message rows
function getAllChatMessages() {
return document.querySelectorAll('li.a-ChatItem-row');
}
// Parse role + text from a message element
function parseMessage(msg) {
var isUser = msg.classList.contains('a-ChatItem-row--outbound');
var textEl = msg.querySelector('div.a-ChatItem-message');
var clone = textEl ? textEl.cloneNode(true) : null;
if (clone) {
clone.querySelectorAll('.apex-msg-time, .apex-msg-actions').forEach(function (el) {
el.remove();
});
}
return {
isUser: isUser,
role: isUser ? 'You' : 'AI Assistant',
text: clone ? clone.innerText.trim() : '',
textEl: textEl
};
}
// 1. EXPORT AS TXT
function exportChatAsTxt() {
var messages = getAllChatMessages();
if (!messages.length) { alert('No chat history to export.'); return; }
var lines = ['CHAT HISTORY - ' + new Date().toLocaleString(), '='.repeat(40), ''];
messages.forEach(function (msg) {
var parsed = parseMessage(msg);
if (!parsed.text) return;
lines.push(parsed.role + ':');
lines.push(parsed.text);
lines.push('');
});
var blob = new Blob([lines.join('\n')], { type: 'text/plain' });
var a = document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = 'chat-history-' + Date.now() + '.txt';
a.click();
URL.revokeObjectURL(a.href); // release the blob URL once the download has started
console.log('TXT exported');
}
// 2. EXPORT AS PDF (Print)
function exportChatAsPdf() {
var messages = getAllChatMessages();
if (!messages.length) { alert('No chat history to export.'); return; }
var rows = [];
messages.forEach(function (msg) {
var parsed = parseMessage(msg);
if (!parsed.text) return;
var timeEl = msg.querySelector('.apex-msg-time');
rows.push({
isUser: parsed.isUser,
role: parsed.role,
text: parsed.text,
time: timeEl ? timeEl.textContent : apexNow()
});
});
if (!rows.length) { alert('No chat history to export.'); return; }
var html = `
<html><head><title>Chat History</title>
<style>
body { font-family: Arial, sans-serif; padding: 30px; color: #333; max-width: 800px; margin: 0 auto; }
h2 { color: #0572ce; border-bottom: 2px solid #0572ce; padding-bottom: 8px; }
.wrap { display: flex; flex-direction: column; gap: 12px; }
.msg { padding: 12px 16px; border-radius: 12px; max-width: 75%; }
.user { background: #e8f0fe; align-self: flex-end; }
.ai { background: #f5f5f5; align-self: flex-start; }
.role { font-weight: bold; font-size: 12px; color: #0572ce; margin-bottom: 6px; }
.text { font-size: 14px; line-height: 1.5; white-space: pre-wrap; }
.time { font-size: 10px; color: #999; margin-top: 6px; }
</style></head>
<body>
<h2>💬 Chat History - ` + new Date().toLocaleString() + `</h2>
<div class="wrap">`;
rows.forEach(function (r) {
html += '<div class="msg ' + (r.isUser ? 'user' : 'ai') + '">';
html += '<div class="role">' + r.role + '</div>';
html += '<div class="text">' + r.text.replace(/&/g,'&amp;').replace(/</g,'&lt;').replace(/>/g,'&gt;').replace(/\n/g,'<br>') + '</div>';
html += '<div class="time">' + r.time + '</div>';
html += '</div>';
});
html += '</div></body></html>';
var win = window.open('', '_blank');
if (!win) { alert('Popup blocked - allow popups for this site to export as PDF.'); return; }
win.document.write(html);
win.document.close();
setTimeout(function () { win.print(); }, 600);
console.log('PDF print opened');
}
// 3. TEXT TO SPEECH
function speakText(text, btn) {
if (!window.speechSynthesis) { alert('TTS not supported in this browser.'); return; }
if (window.speechSynthesis.speaking) {
window.speechSynthesis.cancel();
document.querySelectorAll('.apex-msg-action-btn.speaking').forEach(function (b) {
b.classList.remove('speaking');
b.innerHTML = '🔊 Read';
});
return;
}
var utterance = new SpeechSynthesisUtterance(text);
utterance.lang = 'en-US';
utterance.rate = 1;
utterance.onend = function () {
if (btn) { btn.classList.remove('speaking'); btn.innerHTML = '🔊 Read'; }
};
utterance.onerror = function () {
if (btn) { btn.classList.remove('speaking'); btn.innerHTML = '🔊 Read'; }
};
if (btn) { btn.classList.add('speaking'); btn.innerHTML = '⏹ Stop'; }
window.speechSynthesis.speak(utterance);
}
// 4. TIMESTAMPS + TTS + RETRY on messages
function addMessageEnhancements() {
getAllChatMessages().forEach(function (msg) {
if (msg.getAttribute('data-apex-enhanced')) return;
msg.setAttribute('data-apex-enhanced', '1');
var parsed = parseMessage(msg);
if (!parsed.textEl) return;
// Timestamp
var timeSpan = document.createElement('span');
timeSpan.className = 'apex-msg-time';
timeSpan.textContent = apexNow();
parsed.textEl.appendChild(timeSpan);
// TTS + Retry on AI messages only
if (!parsed.isUser) {
var actionsDiv = document.createElement('div');
actionsDiv.className = 'apex-msg-actions';
var ttsBtn = document.createElement('button');
ttsBtn.className = 'apex-msg-action-btn';
ttsBtn.innerHTML = '🔊 Read';
ttsBtn.addEventListener('click', function () {
var clone = parsed.textEl.cloneNode(true);
clone.querySelectorAll('.apex-msg-time, .apex-msg-actions').forEach(function (el) {
el.remove();
});
speakText(clone.innerText.trim(), ttsBtn);
});
var retryBtn = document.createElement('button');
retryBtn.className = 'apex-msg-action-btn';
retryBtn.innerHTML = '🔁 Retry';
retryBtn.addEventListener('click', function () {
if (!apexLastMessage) { alert('No previous message to retry.'); return; }
var input = document.querySelector('textarea.a-ChatInput-text');
if (input) {
setTextareaValue(input, apexLastMessage);
setTimeout(function () {
var sendBtn = document.querySelector('button.a-ChatInput-button--send');
if (sendBtn && !sendBtn.disabled) sendBtn.click();
}, 200);
}
});
actionsDiv.appendChild(ttsBtn);
actionsDiv.appendChild(retryBtn);
parsed.textEl.after(actionsDiv);
}
// Track last user message for Retry
if (parsed.isUser) {
apexLastMessage = parsed.text;
}
});
}
// 5. QUICK PROMPTS + EXPORT TOOLBAR
function initChatToolbar() {
if (document.getElementById('apex-chat-toolbar')) return;
var chatInput = document.querySelector('div.a-ChatInput');
var input = document.querySelector('textarea.a-ChatInput-text');
if (!chatInput || !input) return;
var toolbar = document.createElement('div');
toolbar.id = 'apex-chat-toolbar';
// Quick prompt buttons
var prompts = [
{ label: '📝 Summarise', text: 'Please summarise the above in bullet points.' },
{ label: '📧 Write Email', text: 'Write a professional email about: ' },
{ label: '🔧 Fix Error', text: 'I have this error, please help fix it:\n\n' },
{ label: '🌐 Translate Tamil', text: 'Translate the following to Tamil:\n\n' },
{ label: '💡 Explain Simply', text: 'Explain this in simple terms:\n\n' }
];
prompts.forEach(function (p) {
var btn = document.createElement('button');
btn.className = 'apex-quick-prompt';
btn.textContent = p.label;
btn.addEventListener('click', function () {
setTextareaValue(input, p.text);
input.focus();
input.setSelectionRange(p.text.length, p.text.length);
});
toolbar.appendChild(btn);
});
// Right side - export button + dropdown menu
var rightWrap = document.createElement('div');
rightWrap.id = 'apex-toolbar-right';
var exportBtn = document.createElement('button');
exportBtn.className = 'apex-tool-btn';
exportBtn.title = 'Export Chat';
exportBtn.innerHTML = `
<svg viewBox="0 0 24 24" width="16" height="16" fill="none" stroke="currentColor"
stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<path d="M21 15v4a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2v-4"/>
<polyline points="7 10 12 15 17 10"/>
<line x1="12" y1="15" x2="12" y2="3"/>
</svg>`;
var exportMenu = document.createElement('div');
exportMenu.id = 'apex-export-menu';
exportMenu.innerHTML = `
<div class="apex-export-opt" id="apex-export-txt">
<span>📄</span> Download TXT
</div>
<div class="apex-export-opt" id="apex-export-pdf">
<span>🖨️</span> Print / PDF
</div>`;
exportBtn.addEventListener('click', function (e) {
e.stopPropagation();
exportMenu.classList.toggle('visible');
});
document.addEventListener('click', function () {
exportMenu.classList.remove('visible');
});
exportMenu.querySelector('#apex-export-txt').addEventListener('click', function () {
exportChatAsTxt();
exportMenu.classList.remove('visible');
});
exportMenu.querySelector('#apex-export-pdf').addEventListener('click', function () {
exportChatAsPdf();
exportMenu.classList.remove('visible');
});
rightWrap.appendChild(exportBtn);
rightWrap.appendChild(exportMenu);
toolbar.appendChild(rightWrap);
chatInput.parentNode.insertBefore(toolbar, chatInput);
}
// MIC BUTTON
function initApexMicButton() {
var input = document.querySelector('textarea.a-ChatInput-text');
if (!input) return false;
if (document.getElementById('apex-mic-btn')) return true;
var textWrap = document.querySelector('div.a-ChatInput-textWrap');
if (!textWrap) return false;
textWrap.style.position = 'relative';
input.style.paddingRight = '50px';
var micBtn = document.createElement('button');
micBtn.id = 'apex-mic-btn';
micBtn.type = 'button';
micBtn.title = 'Click to speak';
micBtn.innerHTML = `
<svg viewBox="0 0 24 24" width="17" height="17" fill="white" xmlns="http://www.w3.org/2000/svg">
<path d="M12 1a4 4 0 0 1 4 4v6a4 4 0 0 1-8 0V5a4 4 0 0 1 4-4zm6 9a1 1 0 0 1 2 0
A8 8 0 0 1 13 17.93V20h2a1 1 0 0 1 0 2H9a1 1 0 0 1 0-2h2v-2.07
A8 8 0 0 1 4 10a1 1 0 0 1 2 0 6 6 0 0 0 12 0z"/>
</svg>`;
textWrap.appendChild(micBtn);
var SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;
if (!SpeechRecognition) {
micBtn.style.opacity = '0.4';
micBtn.style.cursor = 'not-allowed';
micBtn.title = 'Use Chrome for speech support.';
return true;
}
apexMicRecognition = new SpeechRecognition();
apexMicRecognition.lang = 'en-US';
apexMicRecognition.interimResults = false;
apexMicRecognition.maxAlternatives = 1;
micBtn.addEventListener('click', function (e) {
e.preventDefault(); e.stopPropagation();
apexMicListening ? apexMicRecognition.stop() : apexMicRecognition.start();
});
apexMicRecognition.onstart = function () {
apexMicListening = true;
micBtn.classList.add('listening');
micBtn.title = 'Listening… click to stop';
};
apexMicRecognition.onresult = function (event) {
var transcript = event.results[event.results.length - 1][0].transcript.trim();
apexLastMessage = transcript;
setTextareaValue(input, transcript);
};
apexMicRecognition.onend = function () {
apexMicListening = false;
micBtn.classList.remove('listening');
micBtn.title = 'Click to speak';
setTimeout(function () {
var sendBtn = document.querySelector('button.a-ChatInput-button--send');
if (sendBtn && !sendBtn.disabled) sendBtn.click();
}, 400);
};
apexMicRecognition.onerror = function (e) {
console.warn('Mic error:', e.error);
apexMicListening = false;
micBtn.classList.remove('listening');
micBtn.title = 'Click to speak';
};
return true;
}
// Init all features
function initAllChatFeatures() {
injectChatStyles();
initApexMicButton();
initChatToolbar();
}
// Watch transcript for new messages
function watchTranscript() {
var transcript = document.querySelector('div.a-ChatTranscript');
if (!transcript) return false;
addMessageEnhancements();
var observer = new MutationObserver(function () {
setTimeout(function () {
addMessageEnhancements();
if (!document.getElementById('apex-mic-btn')) initApexMicButton();
if (!document.getElementById('apex-chat-toolbar')) initChatToolbar();
}, 300);
});
observer.observe(transcript, { childList: true, subtree: true });
return true;
}
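One detail in exportChatAsPdf worth calling out: message text is HTML-escaped before being written into the print window, so AI answers containing code or tags render as text instead of being interpreted as markup. Lifted out of the DOM code, the step looks like the sketch below - escapeHtml and toBubbleHtml are hypothetical names for illustration, and this version also escapes & first, which is worth doing so later entities aren't mangled.

```javascript
// Hypothetical standalone helpers mirroring the escaping step in
// exportChatAsPdf - pure string logic, runnable outside the browser.
function escapeHtml(text) {
  return text
    .replace(/&/g, '&amp;')  // escape & first so &lt;/&gt; below aren't double-escaped
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

// Build one print-view bubble the same way exportChatAsPdf assembles its rows.
function toBubbleHtml(row) {
  return '<div class="msg ' + (row.isUser ? 'user' : 'ai') + '">' +
    '<div class="role">' + row.role + '</div>' +
    '<div class="text">' + escapeHtml(row.text).replace(/\n/g, '<br>') + '</div>' +
    '</div>';
}

console.log(toBubbleHtml({
  isUser: false,
  role: 'AI Assistant',
  text: 'Wrap it in <b>bold</b> tags\nlike this'
}));
```

Because the logic has no DOM dependency, you can test it in a Node console before trusting it with a long conversation.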
Page Load Dynamic Action - Boot Script
Now create a Dynamic Action on Page Load. Add the following as the first True Action (Execute JavaScript). This observer waits for the dialog's DOM elements to exist before firing off all the enhancement functions - important because the AI dialog is rendered lazily after you click the button.
function bootChatFeatures() {
injectChatStyles();
var bootObserver = new MutationObserver(function () {
var ready = document.querySelector('div.a-ChatInput-actions') &&
document.querySelector('div.a-ChatInput-textWrap') &&
document.querySelector('div.a-ChatTranscript');
if (!ready) return;
initAllChatFeatures();
watchTranscript();
bootObserver.disconnect();
console.log('All chat features ready');
});
bootObserver.observe(document.body, { childList: true, subtree: true });
// Try immediately in case dialog already open
if (document.querySelector('div.a-ChatInput-actions') &&
document.querySelector('div.a-ChatTranscript')) {
initAllChatFeatures();
watchTranscript();
}
}
bootChatFeatures();
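While you're testing, one part of the declaration section that's easy to get subtly wrong is the 12-hour conversion in apexNow - midnight and noon are the edge cases. Here's the same logic refactored to take the Date as a parameter (formatChatTime is a hypothetical name used only for this sketch) so you can sanity-check it outside the browser:

```javascript
// apexNow's formatting logic with the Date injected as a parameter,
// so the midnight/noon edge cases can be checked deterministically.
function formatChatTime(d) {
  var h = d.getHours(), m = d.getMinutes();
  var ampm = h >= 12 ? 'PM' : 'AM';
  h = h % 12 || 12;  // `|| 12` maps hour 0 (midnight) and hour 12 (noon) to "12"
  return h + ':' + (m < 10 ? '0' + m : m) + ' ' + ampm;
}

console.log(formatChatTime(new Date(2024, 0, 1, 0, 5)));   // → "12:05 AM"
console.log(formatChatTime(new Date(2024, 0, 1, 12, 0)));  // → "12:00 PM"
console.log(formatChatTime(new Date(2024, 0, 1, 15, 42))); // → "3:42 PM"
```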
Optional - Custom CSS Styling
If you want the chat dialog to look a bit more polished than the default, paste the CSS below into your page's Inline CSS field (under Page → CSS → Inline). It themes the dialog with a dark teal header, coral send button, and styled message bubbles. Completely optional - the features work fine without it.
/* Dialog / Modal Container */
.a-Dialog--aiAssistant .a-Dialog-header,
.a-ChatDialog .a-Dialog-header {
background: #1e4a5c !important;
border-bottom: 2px solid #e8503a !important;
color: #ffffff !important;
padding: 14px 20px !important;
}
.a-Dialog--aiAssistant .a-Dialog-title,
.a-ChatDialog .a-Dialog-title {
color: #ffffff !important;
font-weight: 600 !important;
font-size: 15px !important;
letter-spacing: 0.3px !important;
}
.a-Dialog--aiAssistant,
.a-ChatDialog {
background: #f4f3ee !important;
border: none !important;
border-radius: 12px !important;
box-shadow: 0 8px 40px rgba(26, 37, 53, 0.35) !important;
overflow: hidden !important;
}
/* Close / Header Buttons */
.a-Dialog--aiAssistant .a-Dialog-closeButton,
.a-ChatDialog .a-Dialog-closeButton {
color: #ffffff !important;
opacity: 0.8 !important;
}
.a-Dialog--aiAssistant .a-Dialog-closeButton:hover,
.a-ChatDialog .a-Dialog-closeButton:hover {
opacity: 1 !important;
color: #e8503a !important;
}
/* Chat Transcript Area */
.a-ChatTranscript {
background: #f4f3ee !important;
padding: 16px 14px !important;
}
/* AI (Inbound) Message Bubble */
.a-ChatItem-row--inbound .a-ChatItem-message {
background: #ffffff !important;
color: #1a2535 !important;
border: 1px solid #dde3e8 !important;
border-radius: 4px 14px 14px 14px !important;
box-shadow: 0 1px 4px rgba(26,37,53,0.08) !important;
font-size: 14px !important;
line-height: 1.6 !important;
padding: 10px 14px !important;
}
/* User (Outbound) Message Bubble */
.a-ChatItem-row--outbound .a-ChatItem-message {
background: #1e4a5c !important;
color: #ffffff !important;
border: none !important;
border-radius: 14px 4px 14px 14px !important;
box-shadow: 0 1px 6px rgba(30,74,92,0.25) !important;
font-size: 14px !important;
line-height: 1.6 !important;
padding: 10px 14px !important;
}
/* AI Avatar */
.a-ChatItem-avatar {
background: #e8503a !important;
color: #ffffff !important;
border-radius: 50% !important;
border: 2px solid #ffffff !important;
box-shadow: 0 2px 6px rgba(232,80,58,0.35) !important;
}
/* Input Area */
.a-ChatInput {
background: #ffffff !important;
border-top: 2px solid #1e4a5c !important;
padding: 10px 12px !important;
}
.a-ChatInput-textWrap {
background: #f4f3ee !important;
border: 1.5px solid #c8d0d8 !important;
border-radius: 10px !important;
transition: border-color 0.2s !important;
}
.a-ChatInput-textWrap:focus-within {
border-color: #1e4a5c !important;
box-shadow: 0 0 0 3px rgba(30,74,92,0.12) !important;
}
.a-ChatInput-text {
background: transparent !important;
color: #1a2535 !important;
font-size: 14px !important;
caret-color: #e8503a !important;
}
.a-ChatInput-text::placeholder {
color: #8a97a6 !important;
}
/* Send Button */
.a-ChatInput-button--send {
background: #e8503a !important;
color: #ffffff !important;
border: none !important;
border-radius: 8px !important;
box-shadow: 0 2px 8px rgba(232,80,58,0.35) !important;
transition: background 0.2s, transform 0.1s !important;
}
.a-ChatInput-button--send:hover {
background: #c83d2a !important;
transform: scale(1.05) !important;
}
.a-ChatInput-button--send:disabled {
background: #b0b8c1 !important;
box-shadow: none !important;
transform: none !important;
}
/* Typing / Loading Indicator */
.a-ChatItem--loading .a-ChatItem-message {
background: #ffffff !important;
border: 1px solid #dde3e8 !important;
border-radius: 4px 14px 14px 14px !important;
}
.a-ChatItem--loading .a-ChatItem-loadingDot {
background: #1e4a5c !important;
}
/* Scrollbar inside transcript */
.a-ChatTranscript::-webkit-scrollbar { width: 5px; }
.a-ChatTranscript::-webkit-scrollbar-track { background: #f4f3ee; }
.a-ChatTranscript::-webkit-scrollbar-thumb { background: #1e4a5c; border-radius: 4px; }
.a-ChatTranscript::-webkit-scrollbar-thumb:hover { background: #e8503a; }
/* Quick Prompt Chips */
.apex-quick-prompt {
border-color: #1e4a5c !important;
color: #1e4a5c !important;
background: #ffffff !important;
}
.apex-quick-prompt:hover {
background: #1e4a5c !important;
color: #ffffff !important;
}
/* Mic Button */
#apex-mic-btn {
background: #1e4a5c !important;
box-shadow: 0 2px 8px rgba(30,74,92,0.35) !important;
}
#apex-mic-btn.listening { background: #e8503a !important; }
/* Toolbar & Export */
#apex-chat-toolbar {
background: #f4f3ee !important;
border-top: 1.5px solid #dde3e8 !important;
}
.apex-tool-btn:hover {
background: rgba(30,74,92,0.1) !important;
color: #1e4a5c !important;
}
#apex-export-menu { border-color: #dde3e8 !important; }
.apex-export-opt:hover { background: #eaf2f5 !important; }
/* Message Action Buttons */
.apex-msg-action-btn {
border-color: #c8d0d8 !important;
color: #5a6a7a !important;
}
.apex-msg-action-btn:hover {
background: #eaf2f5 !important;
color: #1e4a5c !important;
border-color: #1e4a5c !important;
}
.apex-msg-action-btn.speaking {
color: #e8503a !important;
border-color: #e8503a !important;
}
/* Timestamp */
.apex-msg-time { color: #8a97a6 !important; }
Wrapping Up
That's everything. Once you deploy the page you should see the quick prompt chips appear just above the input box, a mic button sitting inside the textarea, and timestamps on every message. AI responses will have Read and Retry buttons below them, and the download icon in the toolbar gives you the TXT/PDF export options.
The whole thing runs without any external libraries - just vanilla JavaScript and a few DOM APIs that are available in every modern browser. The MutationObserver approach means it adapts cleanly to whatever APEX renders, even if the dialog is opened and closed multiple times in the same session.
Feel free to extend the quick prompts array with whatever shortcuts make sense for your app - that's the easiest part to customise. Hope this helps someone skip a few hours of trial and error!
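For instance, adding more chips is just a matter of appending entries to the prompts array inside initChatToolbar - the two new entries below are made-up examples:

```javascript
// A trimmed copy of the prompts array from initChatToolbar.
var prompts = [
  { label: '📝 Summarise', text: 'Please summarise the above in bullet points.' },
  { label: '💡 Explain Simply', text: 'Explain this in simple terms:\n\n' }
];

// Hypothetical extra chips - each needs only a label (shown on the chip)
// and the text dropped into the chat input when it's clicked.
prompts.push(
  { label: '🧪 Write Test', text: 'Write a unit test for the following code:\n\n' },
  { label: '📊 To Table', text: 'Reformat the above as a markdown table.' }
);

console.log(prompts.map(function (p) { return p.label; }).join(' | '));
```

The forEach loop in initChatToolbar picks the new entries up automatically, so no other changes are needed.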