
Commit 727ecb2

Removing support for the topK property in the PromptModelInferenceConfiguration object, making the PromptTemplateConfiguration property required, and limiting the maximum number of PromptVariants to 1
aws-sdk-dotnet-automation committed Oct 17, 2024
1 parent 842a716 commit 727ecb2
Showing 21 changed files with 23 additions and 64 deletions.
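
Taken together, the three changes in the diffs below are caller-visible in the generated .NET client. The following is a minimal sketch of what a conforming CreatePrompt call might look like after this commit; the members changed in this diff (Variants, TemplateConfiguration, TopK on PromptModelInferenceConfiguration) are taken from it, while PromptInferenceConfiguration, TextPromptTemplateConfiguration, PromptTemplateType.TEXT, and all literal values are illustrative assumptions rather than confirmed API surface:

```csharp
using System.Collections.Generic;
using Amazon.BedrockAgent.Model;

// Hypothetical post-commit usage sketch. Types not visible in this diff
// (PromptInferenceConfiguration, TextPromptTemplateConfiguration) are assumed.
var request = new CreatePromptRequest
{
    Name = "example-prompt",
    // Variants is now validated as Min=0, Max=1; adding a second
    // variant would fail the AWSProperty range check.
    Variants = new List<PromptVariant>
    {
        new PromptVariant
        {
            Name = "variant-1",
            TemplateType = PromptTemplateType.TEXT,   // assumed enum value
            // TemplateConfiguration is now marked Required=true, so it
            // must be populated on every variant.
            TemplateConfiguration = new PromptTemplateConfiguration
            {
                Text = new TextPromptTemplateConfiguration { Text = "Summarize: {{input}}" }
            },
            InferenceConfiguration = new PromptInferenceConfiguration
            {
                Text = new PromptModelInferenceConfiguration
                {
                    Temperature = 0.7f,
                    TopP = 0.9f
                    // TopK no longer exists on PromptModelInferenceConfiguration;
                    // assigning it is now a compile-time error.
                }
            }
        }
    }
};
```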
@@ -4525,7 +4525,6 @@
"maxTokens":{"shape":"MaximumLength"},
"stopSequences":{"shape":"StopSequences"},
"temperature":{"shape":"Temperature"},
"topK":{"shape":"TopK"},
"topP":{"shape":"TopP"}
}
},
@@ -4599,6 +4598,7 @@
"type":"structure",
"required":[
"name",
"templateConfiguration",
"templateType"
],
"members":{
@@ -4614,7 +4614,7 @@
"PromptVariantList":{
"type":"list",
"member":{"shape":"PromptVariant"},
"max":3,
"max":1,
"min":0,
"sensitive":true
},
@@ -33,7 +33,7 @@
"GetFlow": "<p>Retrieves information about a flow. For more information, see <a href=\"https://docs.aws.amazon.com/bedrock/latest/userguide/flows-manage.html\">Manage a flow in Amazon Bedrock</a> in the Amazon Bedrock User Guide.</p>",
"GetFlowAlias": "<p>Retrieves information about a flow. For more information, see <a href=\"https://docs.aws.amazon.com/bedrock/latest/userguide/flows-deploy.html\">Deploy a flow in Amazon Bedrock</a> in the Amazon Bedrock User Guide.</p>",
"GetFlowVersion": "<p>Retrieves information about a version of a flow. For more information, see <a href=\"https://docs.aws.amazon.com/bedrock/latest/userguide/flows-deploy.html\">Deploy a flow in Amazon Bedrock</a> in the Amazon Bedrock User Guide.</p>",
"GetIngestionJob": "<p>Gets information about a data ingestion job. Data sources are ingested into your knowledge base so that Large Lanaguage Models (LLMs) can use your data.</p>",
"GetIngestionJob": "<p>Gets information about a data ingestion job. Data sources are ingested into your knowledge base so that Large Language Models (LLMs) can use your data.</p>",
"GetKnowledgeBase": "<p>Gets information about a knoweldge base.</p>",
"GetPrompt": "<p>Retrieves information about the working draft (<code>DRAFT</code> version) of a prompt or a version of it, depending on whether you include the <code>promptVersion</code> field or not. For more information, see <a href=\"https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-management-manage.html#prompt-management-view.html\">View information about prompts using Prompt management</a> and <a href=\"https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-management-deploy.html#prompt-management-versions-view.html\">View information about a version of your prompt</a> in the Amazon Bedrock User Guide.</p>",
"ListAgentActionGroups": "<p>Lists the action groups for an agent and information about each one.</p>",
@@ -3010,8 +3010,7 @@
"TopK": {
"base": null,
"refs": {
"InferenceConfiguration$topK": "<p>While generating a response, the model determines the probability of the following token at each point of generation. The value that you set for <code>topK</code> is the number of most-likely candidates from which the model chooses the next token in the sequence. For example, if you set <code>topK</code> to 50, the model selects the next token from among the top 50 most likely choices.</p>",
"PromptModelInferenceConfiguration$topK": "<p>The number of most-likely candidates that the model considers for the next token during generation.</p>"
"InferenceConfiguration$topK": "<p>While generating a response, the model determines the probability of the following token at each point of generation. The value that you set for <code>topK</code> is the number of most-likely candidates from which the model chooses the next token in the sequence. For example, if you set <code>topK</code> to 50, the model selects the next token from among the top 50 most likely choices.</p>"
}
},
"TopP": {
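
The removed doc string above also explains what `topK` does: sampling is restricted to the K most likely next-token candidates. The stand-alone sketch below illustrates that idea; it is purely conceptual and is not SDK code:

```csharp
using System;
using System.Linq;

static class TopKDemo
{
    // Conceptual illustration of top-k sampling as described in the doc
    // string above: keep the k highest-probability candidates, renormalize
    // their probabilities, and sample from that reduced set.
    static int SampleTopK(double[] probs, int k, Random rng)
    {
        var candidates = probs
            .Select((p, i) => (Prob: p, Index: i))
            .OrderByDescending(t => t.Prob)
            .Take(k)
            .ToArray();

        double mass = candidates.Sum(t => t.Prob);   // renormalization denominator
        double r = rng.NextDouble() * mass;
        foreach (var (prob, index) in candidates)
        {
            r -= prob;
            if (r <= 0) return index;
        }
        return candidates[^1].Index;                 // guard against rounding error
    }

    static void Main()
    {
        var probs = new[] { 0.05, 0.40, 0.10, 0.30, 0.15 };
        // With k = 2, only tokens 1 (p=0.40) and 3 (p=0.30) can ever be chosen.
        Console.WriteLine(SampleTopK(probs, k: 2, new Random(42)));
    }
}
```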
@@ -636,7 +636,7 @@
{"shape":"InternalServerException"},
{"shape":"ResourceNotFoundException"}
],
"documentation":"<p>Gets information about a data ingestion job. Data sources are ingested into your knowledge base so that Large Lanaguage Models (LLMs) can use your data.</p>"
"documentation":"<p>Gets information about a data ingestion job. Data sources are ingested into your knowledge base so that Large Language Models (LLMs) can use your data.</p>"
},
"GetKnowledgeBase":{
"name":"GetKnowledgeBase",
@@ -6469,10 +6469,6 @@
"shape":"Temperature",
"documentation":"<p>Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.</p>"
},
"topK":{
"shape":"TopK",
"documentation":"<p>The number of most-likely candidates that the model considers for the next token during generation.</p>"
},
"topP":{
"shape":"TopP",
"documentation":"<p>The percentage of most-likely candidates that the model considers for the next token.</p>"
@@ -6583,6 +6579,7 @@
"type":"structure",
"required":[
"name",
"templateConfiguration",
"templateType"
],
"members":{
@@ -6617,7 +6614,7 @@
"PromptVariantList":{
"type":"list",
"member":{"shape":"PromptVariant"},
"max":3,
"max":1,
"min":0,
"sensitive":true
},
@@ -2181,11 +2181,6 @@
<min>0</min>
<max>1</max>
</property-value-rule>
- <property-value-rule>
-   <property>Amazon.BedrockAgent.Model.PromptModelInferenceConfiguration.TopK</property>
-   <min>0</min>
-   <max>500</max>
- </property-value-rule>
<property-value-rule>
<property>Amazon.BedrockAgent.Model.PromptModelInferenceConfiguration.TopP</property>
<min>0</min>
@@ -171,7 +171,7 @@ internal bool IsSetTags()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -202,7 +202,7 @@ internal bool IsSetUpdatedAt()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -204,7 +204,7 @@ internal bool IsSetUpdatedAt()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -32,7 +32,7 @@ namespace Amazon.BedrockAgent.Model
/// <summary>
/// Container for the parameters to the GetIngestionJob operation.
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
public partial class GetIngestionJobRequest : AmazonBedrockAgentRequest
{
@@ -205,7 +205,7 @@ internal bool IsSetUpdatedAt()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -78,12 +78,6 @@ public void Marshall(PromptModelInferenceConfiguration requestObject, JsonMarsha
}
}

- if(requestObject.IsSetTopK())
- {
-     context.Writer.WritePropertyName("topK");
-     context.Writer.Write(requestObject.TopK);
- }
-
if(requestObject.IsSetTopP())
{
context.Writer.WritePropertyName("topP");
@@ -84,12 +84,6 @@ public PromptModelInferenceConfiguration Unmarshall(JsonUnmarshallerContext cont
unmarshalledObject.Temperature = unmarshaller.Unmarshall(context);
continue;
}
- if (context.TestExpression("topK", targetDepth))
- {
-     var unmarshaller = IntUnmarshaller.Instance;
-     unmarshalledObject.TopK = unmarshaller.Unmarshall(context);
-     continue;
- }
if (context.TestExpression("topP", targetDepth))
{
var unmarshaller = FloatUnmarshaller.Instance;
@@ -39,7 +39,6 @@ public partial class PromptModelInferenceConfiguration
private int? _maxTokens;
private List<string> _stopSequences = AWSConfigs.InitializeCollections ? new List<string>() : null;
private float? _temperature;
- private int? _topk;
private float? _topp;

/// <summary>
@@ -100,26 +99,6 @@ internal bool IsSetTemperature()
return this._temperature.HasValue;
}

- /// <summary>
- /// Gets and sets the property TopK.
- /// <para>
- /// The number of most-likely candidates that the model considers for the next token during
- /// generation.
- /// </para>
- /// </summary>
- [AWSProperty(Min=0, Max=500)]
- public int TopK
- {
-     get { return this._topk.GetValueOrDefault(); }
-     set { this._topk = value; }
- }
-
- // Check to see if TopK property is set
- internal bool IsSetTopK()
- {
-     return this._topk.HasValue;
- }
-
/// <summary>
/// Gets and sets the property TopP.
/// <para>
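
The block removed above follows the generated SDK's standard property convention. The condensed sketch below restates that pattern for readers unfamiliar with it; it is illustrative only, not actual SDK source:

```csharp
// Condensed sketch of the nullable-backing-field pattern used by the
// generated SDK classes (as in the TopK block removed above): the public
// property is non-nullable, while an internal IsSet* probe tells the
// request marshaller whether the caller actually assigned a value, so
// unset properties are omitted from the request JSON.
public partial class ExampleInferenceConfiguration
{
    private int? _topk;

    public int TopK
    {
        get { return this._topk.GetValueOrDefault(); } // default(int) when unset
        set { this._topk = value; }
    }

    // The marshaller checks this before writing "topK" to the payload.
    internal bool IsSetTopK()
    {
        return this._topk.HasValue;
    }
}
```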
@@ -125,6 +125,7 @@ internal bool IsSetName()
/// Contains configurations for the prompt template.
/// </para>
/// </summary>
+ [AWSProperty(Required=true)]
public PromptTemplateConfiguration TemplateConfiguration
{
get { return this._templateConfiguration; }
@@ -147,7 +147,7 @@ internal bool IsSetPromptIdentifier()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -204,7 +204,7 @@ internal bool IsSetUpdatedAt()
/// A list of objects, each containing details about a variant of the prompt.
/// </para>
/// </summary>
- [AWSProperty(Sensitive=true, Min=0, Max=3)]
+ [AWSProperty(Sensitive=true, Min=0, Max=1)]
public List<PromptVariant> Variants
{
get { return this._variants; }
@@ -2675,7 +2675,7 @@ public virtual GetFlowVersionResponse EndGetFlowVersion(IAsyncResult asyncResult

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
///
@@ -2023,7 +2023,7 @@ public partial interface IAmazonBedrockAgent : IAmazonService, IDisposable

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
///
@@ -3006,7 +3006,7 @@ public virtual GetFlowVersionResponse GetFlowVersion(GetFlowVersionRequest reque

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
///
@@ -3040,7 +3040,7 @@ public virtual GetIngestionJobResponse GetIngestionJob(GetIngestionJobRequest re

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
/// <param name="cancellationToken">
@@ -2382,7 +2382,7 @@ public partial interface IAmazonBedrockAgent : IAmazonService, IDisposable

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
///
@@ -2410,7 +2410,7 @@ public partial interface IAmazonBedrockAgent : IAmazonService, IDisposable

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
/// <param name="cancellationToken">
@@ -2070,7 +2070,7 @@ internal virtual GetIngestionJobResponse GetIngestionJob(GetIngestionJobRequest

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
/// <param name="cancellationToken">
@@ -1340,7 +1340,7 @@ public partial interface IAmazonBedrockAgent : IAmazonService, IDisposable

/// <summary>
/// Gets information about a data ingestion job. Data sources are ingested into your knowledge
- base so that Large Lanaguage Models (LLMs) can use your data.
+ base so that Large Language Models (LLMs) can use your data.
/// </summary>
/// <param name="request">Container for the necessary parameters to execute the GetIngestionJob service method.</param>
/// <param name="cancellationToken">
