feat: Added new API method withLlmCustomAttributes to run a function in an LLM context #2437

Merged: 39 commits, Aug 22, 2024
Changes shown from 35 of 39 commits
2361240 feat: Set LLM events custom attributes (MikeVaz, Jul 16, 2024)
4dd3be8 fix: Debug statements (MikeVaz, Jul 16, 2024)
55da778 fix: Example (MikeVaz, Jul 16, 2024)
720be07 fix: Add guards (MikeVaz, Jul 16, 2024)
878d6b5 fix: Add unit test (MikeVaz, Jul 17, 2024)
de8b498 feat: WithLlmCustomAttributes (MikeVaz, Jul 22, 2024)
dc0261a feat: Merge parent context into children (RyanKadri, Jul 23, 2024)
07cbcab Merge branch 'newrelic:main' into setLlmCustomAttributes (MikeVaz, Jul 29, 2024)
08927ec feat: Option 4 (MikeVaz, Jul 31, 2024)
8b0cd84 fix: Unnecessary test (MikeVaz, Jul 31, 2024)
7ab7bcd fix: Test name (MikeVaz, Jul 31, 2024)
0b2f2d6 fix: Integration tests (MikeVaz, Aug 1, 2024)
f75dabf fix: Remove extra npm scripts (MikeVaz, Aug 2, 2024)
19a9e80 fix: Remove extra npm scripts (MikeVaz, Aug 2, 2024)
24a13bd fix: PR feedback (MikeVaz, Aug 2, 2024)
af4aafa fix: PR feedback (MikeVaz, Aug 2, 2024)
d428aaa fix: PR feedback (MikeVaz, Aug 5, 2024)
d2264ad fix: Unit test and PR feedback (MikeVaz, Aug 5, 2024)
807f0cf Merge branch 'newrelic:main' into setLlmCustomAttributes (MikeVaz, Aug 13, 2024)
ef488da fix: PR feedback (MikeVaz, Aug 13, 2024)
bcc9e15 fix: Typo (MikeVaz, Aug 13, 2024)
cef4289 fix: PR feedback (MikeVaz, Aug 13, 2024)
87588e3 fix: PR feedback (MikeVaz, Aug 13, 2024)
d0101f6 fix: Unit test (MikeVaz, Aug 13, 2024)
dbd7282 Merge branch 'newrelic:main' into setLlmCustomAttributes (MikeVaz, Aug 14, 2024)
e7531d7 fix: Apply solution 1 (MikeVaz, Aug 14, 2024)
308bdeb Update lib/util/llm-utils.js (MikeVaz, Aug 15, 2024)
a32f932 fix: PR feedback (MikeVaz, Aug 15, 2024)
ba516b0 Merge branch 'newrelic:main' into setLlmCustomAttributes (MikeVaz, Aug 16, 2024)
6e629f9 fix: PR feedback (MikeVaz, Aug 20, 2024)
a8cbb8a Merge branch 'newrelic:main' into setLlmCustomAttributes (MikeVaz, Aug 20, 2024)
6d3bd2d fix: PR feedback (MikeVaz, Aug 20, 2024)
1f2d240 Update lib/instrumentation/openai.js (MikeVaz, Aug 21, 2024)
8cc326f Update lib/instrumentation/langchain/common.js (MikeVaz, Aug 21, 2024)
cab2f74 Update lib/instrumentation/aws-sdk/v3/bedrock.js (MikeVaz, Aug 21, 2024)
67a35de fix: Improve test coverage (MikeVaz, Aug 21, 2024)
d8353ff Merge branch 'setLlmCustomAttributes' of https://github.com/MikeVaz/n… (MikeVaz, Aug 21, 2024)
cb423d6 fix: More unit test and PR feedback (MikeVaz, Aug 21, 2024)
21e4b5f chore: Addressed code review feedback (bizob2828, Aug 22, 2024)
48 changes: 48 additions & 0 deletions api.js
@@ -32,6 +32,7 @@
const { DESTINATIONS } = require('./lib/config/attribute-filter')
const parse = require('module-details-from-path')
const { isSimpleObject } = require('./lib/util/objects')
const { AsyncLocalStorage } = require('async_hooks')

/*
*
@@ -1548,7 +1549,7 @@
* @param {string} params.traceId Identifier for the feedback event.
* Obtained from {@link getTraceMetadata}.
* @param {string} params.category A tag for the event.
 * @param {string} params.rating An indicator of how useful the message was.
* @param {string} [params.message] The message that triggered the event.
* @param {object} [params.metadata] Additional key-value pairs to associate
* with the recorded event.
@@ -1902,4 +1903,51 @@
transaction.ignoreApdex = true
}

/**
 * Run a function with the passed-in LLM context as the active context and return its return value.
 *
 * An example of setting a custom attribute:
 *
 * newrelic.withLlmCustomAttributes({ 'llm.someAttribute': 'someValue' }, () => {
 *   return
 * })
 * @param {Object} context LLM custom attributes context
 * @param {Function} callback The function to execute in context.
 * @returns {*} The return value of the callback, or undefined when no valid callback is given.
 */
API.prototype.withLlmCustomAttributes = function withLlmCustomAttributes(context, callback) {
const metric = this.agent.metrics.getOrCreateMetric(
NAMES.SUPPORTABILITY.API + '/withLlmCustomAttributes'
)
metric.incrementCallCount()

const transaction = this.agent.tracer.getTransaction()

if (!callback || typeof callback !== 'function') {
logger.warn('withLlmCustomAttributes must be used with a valid callback')
return
}

if (!transaction) {
logger.warn('withLlmCustomAttributes must be called within the scope of a transaction.')
return callback()
}

for (const [key, value] of Object.entries(context)) {
if (typeof value === 'object' || typeof value === 'function') {
logger.warn(`Invalid attribute type for ${key}. Skipped.`)
delete context[key]
} else if (key.indexOf('llm.') !== 0) {
logger.warn(`Invalid attribute name ${key}. Renamed to "llm.${key}".`)
delete context[key]
context[`llm.${key}`] = value
}
}

transaction._llmContextManager = transaction._llmContextManager || new AsyncLocalStorage()
const parentContext = transaction._llmContextManager.getStore()

const fullContext = Object.assign({}, parentContext || {}, context || {})
return transaction._llmContextManager.run(fullContext, callback)
}

module.exports = API
8 changes: 7 additions & 1 deletion lib/instrumentation/aws-sdk/v3/bedrock.js
@@ -18,6 +18,7 @@ const { DESTINATIONS } = require('../../../config/attribute-filter')
const { AI } = require('../../../metrics/names')
const { RecorderSpec } = require('../../../shim/specs')
const InstrumentationDescriptor = require('../../../instrumentation-descriptor')
const { extractLlmContext } = require('../../../util/llm-utils')

let TRACKING_METRIC

@@ -55,7 +56,12 @@ function isStreamingEnabled({ commandName, config }) {
*/
function recordEvent({ agent, type, msg }) {
msg.serialize()
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const llmContext = extractLlmContext(agent)

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext)
])
}

/**
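The same `recordEvent` change is repeated in the langchain and openai instrumentations below; the event-side half of the feature is just this merge. A minimal sketch, using a hypothetical event payload, shows that the context attributes land on a copy and the original `msg` is not mutated:

```javascript
// Sketch of the merge performed in recordEvent: llmContext keys are
// copied onto a fresh object together with the event fields.
const msg = { completion_id: 'chat-1', content: 'hello' } // hypothetical LLM event
const llmContext = { 'llm.conversation_id': 'convo-1' }   // as returned by extractLlmContext

const merged = Object.assign({}, msg, llmContext)
console.log(merged['llm.conversation_id']) // 'convo-1'
console.log('llm.conversation_id' in msg)  // false: msg is untouched
```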
8 changes: 7 additions & 1 deletion lib/instrumentation/langchain/common.js
@@ -7,6 +7,7 @@
const {
AI: { LANGCHAIN }
} = require('../../metrics/names')
const { extractLlmContext } = require('../../util/llm-utils')

const common = module.exports

@@ -49,7 +50,12 @@ common.mergeMetadata = function mergeMetadata(localMeta = {}, paramsMeta = {}) {
*/
common.recordEvent = function recordEvent({ agent, type, msg, pkgVersion }) {
agent.metrics.getOrCreateMetric(`${LANGCHAIN.TRACKING_PREFIX}/${pkgVersion}`).incrementCallCount()
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const llmContext = extractLlmContext(agent)

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext)
])
}

/**
8 changes: 7 additions & 1 deletion lib/instrumentation/openai.js
@@ -12,6 +12,7 @@ const {
LlmErrorMessage
} = require('../../lib/llm-events/openai')
const { RecorderSpec } = require('../../lib/shim/specs')
const { extractLlmContext } = require('../util/llm-utils')

const MIN_VERSION = '4.0.0'
const MIN_STREAM_VERSION = '4.12.2'
@@ -75,7 +76,12 @@ function decorateSegment({ shim, result, apiKey }) {
* @param {object} params.msg LLM event
*/
function recordEvent({ agent, type, msg }) {
agent.customEventAggregator.add([{ type, timestamp: Date.now() }, msg])
const llmContext = extractLlmContext(agent)

agent.customEventAggregator.add([
{ type, timestamp: Date.now() },
Object.assign({}, msg, llmContext)
])
}

/**
34 changes: 34 additions & 0 deletions lib/util/llm-utils.js
@@ -0,0 +1,34 @@
/*
* Copyright 2020 New Relic Corporation. All rights reserved.
* SPDX-License-Identifier: Apache-2.0
*/

'use strict'

exports = module.exports = { extractLlmContext, extractLlmAttribtues }

/**
* Extract LLM attributes from the LLM context
*
* @param {Object} context LLM context object
* @returns {Object} LLM custom attributes
*/
function extractLlmAttribtues(context) {
return Object.keys(context || {}).reduce((result, key) => {
if (key.indexOf('llm.') === 0) {
result[key] = context[key]
}
return result
}, {})
}

/**
* Extract LLM context from the active transaction
*
* @param {Agent} agent NR agent instance
* @returns {Object} LLM context object
*/
function extractLlmContext(agent) {
const context = agent.tracer.getTransaction()._llmContextManager?.getStore() || {}
return extractLlmAttribtues(context)
}
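The helper above keeps only keys that begin with the `llm.` prefix; anything else (including near-misses like `fllm.skip`) is dropped. The filter can be exercised in isolation — this sketch copies the reducer as merged, preserving the `extractLlmAttribtues` spelling used throughout the PR:

```javascript
// Copy of the prefix filter from lib/util/llm-utils.js: only keys that
// start with exactly 'llm.' survive (indexOf('llm.') === 0), and a
// null/undefined context yields an empty object.
function extractLlmAttribtues(context) {
  return Object.keys(context || {}).reduce((result, key) => {
    if (key.indexOf('llm.') === 0) {
      result[key] = context[key]
    }
    return result
  }, {})
}

const filtered = extractLlmAttribtues({ 'llm.get': 2, 'skip': 1, 'fllm.skip': 3 })
console.log(filtered) // { 'llm.get': 2 }
```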
43 changes: 43 additions & 0 deletions test/unit/api/api-llm.test.js
@@ -121,6 +121,49 @@ tap.test('Agent API LLM methods', (t) => {
})
})

t.test('withLlmCustomAttributes', (t) => {
const { api } = t.context
helper.runInTransaction(api.agent, (tx) => {
t.context.agent.tracer.getTransaction = () => {
return tx
}

t.doesNotThrow(() => {
api.withLlmCustomAttributes(null, null)
t.equal(loggerMock.warn.callCount, 1)
})

api.withLlmCustomAttributes(
{
'toRename': 'value1',
'llm.number': 1,
'llm.boolean': true,
'toDelete': () => {},
'toDelete2': {},
'toDelete3': []
},
() => {
const contextManager = tx._llmContextManager
const parentContext = contextManager.getStore()
t.equal(parentContext['llm.toRename'], 'value1')
t.notOk(parentContext.toDelete)
t.notOk(parentContext.toDelete2)
t.notOk(parentContext.toDelete3)
t.equal(parentContext['llm.number'], 1)
t.equal(parentContext['llm.boolean'], true)

api.withLlmCustomAttributes({ 'llm.someAttribute': 'someValue' }, () => {
const contextManager = tx._llmContextManager
const context = contextManager.getStore()
t.equal(context[`llm.toRename`], 'value1')
t.equal(context['llm.someAttribute'], 'someValue')
t.end()
})
}
)
})
})

t.test('setLlmTokenCount should register callback to calculate token counts', async (t) => {
const { api, agent } = t.context
function callback(model, content) {
2 changes: 1 addition & 1 deletion test/unit/api/stub.test.js
@@ -8,7 +8,7 @@
const tap = require('tap')
const API = require('../../../stub_api')

const EXPECTED_API_COUNT = 36
const EXPECTED_API_COUNT = 37

tap.test('Agent API - Stubbed Agent API', (t) => {
t.autoend()
1 change: 1 addition & 0 deletions test/unit/instrumentation/openai.test.js
@@ -119,5 +119,6 @@ test('openai unit tests', (t) => {
t.equal(isWrapped, false, 'should not wrap chat completions create')
t.end()
})

t.end()
})
50 changes: 50 additions & 0 deletions test/unit/util/llm-utils.test.js
@@ -0,0 +1,50 @@
/*
* Copyright 2023 New Relic Corporation. All rights reserved.
* SPDX-License-Identifier: Apache-2.0
*/

'use strict'

const tap = require('tap')
const { extractLlmAttribtues, extractLlmContext } = require('../../../lib/util/llm-utils')
const { AsyncLocalStorage } = require('async_hooks')

tap.test('extractLlmAttributes', (t) => {
const context = {
'skip': 1,
'llm.get': 2,
'fllm.skip': 3
}

const llmContext = extractLlmAttribtues(context)
t.notOk(llmContext.skip)
t.notOk(llmContext['fllm.skip'])
t.equal(llmContext['llm.get'], 2)
t.end()
})

tap.test('extractLlmContext', (t) => {
const tx = {
_llmContextManager: new AsyncLocalStorage()
}
const agent = {
tracer: {
getTransaction: () => {
return tx
}
}
}

tx._llmContextManager.run(null, () => {
const llmContext = extractLlmContext(agent)
t.equal(typeof llmContext, 'object')
t.equal(Object.entries(llmContext).length, 0)
})

tx._llmContextManager.run({ 'llm.test': 1, 'skip': 2 }, () => {
const llmContext = extractLlmContext(agent)
t.equal(llmContext['llm.test'], 1)
t.notOk(llmContext.skip)
t.end()
})
})
26 changes: 26 additions & 0 deletions test/versioned/aws-sdk-v3/bedrock-chat-completions.tap.js
@@ -158,6 +158,32 @@ tap.afterEach(async (t) => {
}
)

tap.test(
`${modelId}: supports custom attributes on LlmChatCompletionMessage(s) and LlmChatCompletionSummary events`,
(t) => {
const { bedrock, client, agent } = t.context
const prompt = `text ${resKey} ultimate question`
const input = requests[resKey](prompt, modelId)
const command = new bedrock.InvokeModelCommand(input)

const api = helper.getAgentApi()
helper.runInTransaction(agent, async (tx) => {
api.addCustomAttribute('llm.conversation_id', 'convo-id')
api.withLlmCustomAttributes({ 'llm.contextAttribute': 'someValue' }, async () => {
await client.send(command)
const events = agent.customEventAggregator.events.toArray()

const chatSummary = events.filter(([{ type }]) => type === 'LlmChatCompletionSummary')[0]
const [, message] = chatSummary
t.equal(message['llm.contextAttribute'], 'someValue')

tx.end()
t.end()
})
})
}
)

tap.test(`${modelId}: text answer (streamed)`, (t) => {
if (modelId.includes('ai21')) {
t.skip('model does not support streaming')
23 changes: 21 additions & 2 deletions test/versioned/langchain/runnables.tap.js
@@ -52,7 +52,6 @@ tap.test('Langchain instrumentation - runnable sequence', (t) => {

t.test('should create langchain events for every invoke call', (t) => {
const { agent, prompt, outputParser, model } = t.context

helper.runInTransaction(agent, async (tx) => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }
@@ -95,11 +94,31 @@ tap.test('Langchain instrumentation - runnable sequence', (t) => {
})
})

t.test('should support custom attributes on the LLM events', (t) => {
const { agent, prompt, outputParser, model } = t.context
const api = helper.getAgentApi()
helper.runInTransaction(agent, async (tx) => {
api.withLlmCustomAttributes({ 'llm.contextAttribute': 'someValue' }, async () => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }

const chain = prompt.pipe(model).pipe(outputParser)
await chain.invoke(input, options)
const events = agent.customEventAggregator.events.toArray()

const [[, message]] = events
t.equal(message['llm.contextAttribute'], 'someValue')

tx.end()
t.end()
})
})
})

t.test(
'should create langchain events for every invoke call on chat prompt + model + parser',
(t) => {
const { agent, prompt, outputParser, model } = t.context

helper.runInTransaction(agent, async (tx) => {
const input = { topic: 'scientist' }
const options = { metadata: { key: 'value', hello: 'world' }, tags: ['tag1', 'tag2'] }