# Opportunity Deep Dive
## User Input
- **Target opportunity:** [Solicitation number, notice ID, GovTribe link, or opportunity title plus agency]
## Goal
Use GovTribe MCP tools to resolve a specific federal contract opportunity (a solicitation, sources sought, presolicitation, or related contracting notice) and produce a grounded record-level brief.
Summarize the key notice facts, surface the most useful government files, and trace the opportunity to directly linked awards or incumbent signals only when the evidence supports that connection.
Focus on exact notice resolution and supporting evidence, not broad market speculation or open-opportunity scanning.
## Required Documentation
Before doing any work, call the **GovTribe Documentation** tool and read the documentation required for this workflow.
Required documentation to retrieve and read:
- `article_name="Search_Query_Guide"`
- `article_name="Search_Mode_Guide"`
- `article_name="Search_Federal_Contract_Opportunities_Tool"`
- `article_name="Search_Government_Files_Tool"`
Retrieve these additional documentation articles only when the workflow needs them:
- `article_name="Search_Federal_Contract_Awards_Tool"` for direct linked-award or incumbent trace
- `article_name="Aggregation_and_Leaderboard_Guide"` only for the linked-award or incumbent aggregation branch
- `article_name="Search_Federal_Contract_IDVs_Tool"` for parent-IDV context
- `article_name="Search_Federal_Contract_Vehicles_Tool"` for vehicle-level context
- `article_name="Search_Vendors_Tool"` for awardee identity normalization when a linked award is found
- `article_name="Vector_Store_Content_Retrieval_Guide"` only if file snippets are not enough and the user explicitly needs deeper document content
Documentation rules:
- Call the **GovTribe Documentation** tool before the first research or search step.
- Read every required documentation article before using other GovTribe tools.
- Follow the documented tool contracts exactly.
- Treat the documentation as binding for tool names, parameters, field definitions, relationship fields, sort keys, and valid output assumptions.
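The documentation rules above can be sketched as a small pre-flight routine. This is a minimal illustration, not a real client: `call_tool` is a hypothetical stand-in for however the host application invokes a GovTribe MCP tool, and only the `article_name` values come from the required list above.

```python
# Required documentation articles, read before any other GovTribe tool call.
REQUIRED_ARTICLES = [
    "Search_Query_Guide",
    "Search_Mode_Guide",
    "Search_Federal_Contract_Opportunities_Tool",
    "Search_Government_Files_Tool",
]

def read_required_docs(call_tool):
    """Fetch every required article via the Documentation tool, in order.

    `call_tool` is a hypothetical dispatcher: call_tool(tool_name, **params).
    """
    return [call_tool("Documentation", article_name=a) for a in REQUIRED_ARTICLES]
```

Optional branch articles (awards, aggregation, IDVs, vehicles, vendors, vector store) are fetched the same way, but only when their branch is actually entered.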
## Required Input
The user must provide a specific target opportunity before analysis begins.
Accept any of the following:
- Solicitation number or notice ID
- GovTribe link
- Opportunity title plus agency
- Plain-language description only if it is specific enough to resolve a single opportunity
Optional constraints the user may provide:
- Whether to include government files or attachments
- Whether to include linked awards or incumbent signals
- Whether to include vehicle or IDV context
- Whether the user wants a short **Opportunity Card** or a fuller due-diligence view
Input rules:
- If the input resolves cleanly to one target, proceed immediately.
- If the input is too vague to resolve a single opportunity, ask for the minimum missing detail needed to proceed.
- Do not guess the target.
- Do not start substantive analysis until the target is resolved.
## Workflow
### Rules
- Call `Documentation` before using any other GovTribe MCP tool.
- Contract-opportunity-first only; do not mix in grant workflows.
- Use GovTribe MCP tools as the primary evidence in this workflow.
- Always set both `search_mode` and `query` on every `Search_*` call.
- Prefer exact lookup before broader recovery attempts.
- Use `query: ""` for filter-defined or ID-defined cohorts.
- Use `per_page: 0`, `query: ""`, and `search_mode: "keyword"` only for the direct linked-award aggregation branch when the user needs incumbent or post-award history.
- Use `fields_to_return` whenever you need more than `govtribe_id`.
- Do not stop early when another tool call is required by the workflow.
- Keep calling tools until the task is complete or the tool budget is reached.
- If a tool returns empty or partial results and the workflow defines another defensible strategy, continue with that next strategy.
- Do not infer a linked award, incumbent, file implication, parent vehicle, or parent IDV unless the retrieved evidence supports it.
- Do not convert a market-similarity result into a confirmed notice trace.
- Do not attempt broad likely-bidder, competitor, customer-sentiment, or shaping analysis in this workflow.
- Keep the workflow compact; add an extra tool call only when it materially improves correctness or grounding.
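The `search_mode` and `query` rules above can be illustrated with two payload sketches. These are illustrative dicts only: the parameter names (`search_mode`, `query`, `per_page`, `fields_to_return`, `federal_meta_opportunity_ids`, `aggregations`) come from the rules above, while the placeholder values in angle brackets are stand-ins you must replace with resolved identifiers.

```python
# Exact lookup: keyword mode with a quoted identifier in `query`.
exact_lookup = {
    "search_mode": "keyword",
    "query": '"<solicitation number or notice ID>"',
    "fields_to_return": ["govtribe_id", "govtribe_url", "solicitation_number"],
}

# Filter-defined aggregation pass: the ID filter, not the query, defines the
# cohort, and `per_page: 0` makes the call aggregation-only.
linked_award_aggregation = {
    "search_mode": "keyword",
    "query": "",
    "per_page": 0,
    "federal_meta_opportunity_ids": ["<resolved meta-opportunity ID>"],
    "aggregations": [
        "dollars_obligated_stats",
        "top_awardees_by_dollars_obligated",
    ],
}
```

Note that both payloads set `search_mode` and `query` explicitly, even when `query` is empty, per the rule that every `Search_*` call sets both.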
### Steps
1. Before doing any research, call `Documentation` and read every required article listed above.
- Add optional documentation articles only when their branch becomes necessary.
- Use the documentation results to confirm valid tool names, `search_mode`, `query`, `fields_to_return`, relationship fields, and sort keys before searching.
2. Resolve the target opportunity with `Search_Federal_Contract_Opportunities`.
- Favor exact lookup first.
- For exact solicitation numbers, notice IDs, quoted titles, or exact GovTribe-derived identifiers, use `search_mode: "keyword"` and a quoted `query`.
- If the user provides a GovTribe link, use the record identity embedded in the link when possible; otherwise resolve it through exact quoted lookup.
- If the user provides an opportunity title plus agency, use the strongest exact title phrase plus agency context, then disambiguate with returned fields.
- Use `fields_to_return` explicitly. At minimum request:
- `govtribe_id`, `govtribe_url`, `govtribe_type`, `solicitation_number`, `name`, `opportunity_type`, `set_aside_type`, `posted_date`, `due_date`, `award_date`, `descriptions`, `govtribe_ai_summary`, `federal_meta_opportunity_id`, `federal_contract_vehicle`, `federal_agency`, `place_of_performance`, `naics_category`, `psc_category`, `government_files`, `federal_contract_awards`, `federal_contract_idvs`, `points_of_contact`
- Use date sorting only when multiple historical variants or amendment-related versions need review.
3. If multiple opportunities match, disambiguate with the minimum additional evidence needed.
- Compare agency, opportunity type, posted date, due date, set-aside, NAICS, PSC, and vehicle context.
- Do not merge multiple possible matches into one narrative.
- If the record still cannot be resolved to a single opportunity, stop and ask for one clarifying detail.
4. If exact resolution fails and the user input is still specific enough, run one narrow recovery retry before asking for clarification.
- Broaden only one dimension at a time, such as title phrasing or agency context.
- If the retry still does not produce a single defensible match, stop and ask for the minimum missing detail.
5. Pull government attachments tied to the resolved opportunity with `Search_Government_Files`.
- Use `federal_contract_opportunity_ids` with the resolved opportunity GovTribe ID.
- Request:
- `govtribe_id`, `govtribe_url`, `name`, `posted_date`, `content_snippet`, `download_url`, `parent_record`, `govtribe_ai_summary`
- Focus on files that materially explain the scope, amendments, Q&A, statements of work, attachments, or evaluation context.
- If no files are returned, say that clearly instead of implying the notice had no attachments.
6. If the user wants award trace, incumbent context, or post-award status, use only direct linkage evidence.
- Call `Documentation` for `article_name="Search_Federal_Contract_Awards_Tool"` before entering this branch.
- If the resolved opportunity has `federal_meta_opportunity_id`, call `Search_Federal_Contract_Awards` with `query: ""`, `search_mode: "keyword"`, and `federal_meta_opportunity_ids`.
- Request:
- `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `completion_date`, `ultimate_completion_date`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `dollars_obligated`, `ceiling_value`, `set_aside_type`, `awardee`, `parent_of_awardee`, `federal_contract_idv`, `federal_contract_vehicle`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `place_of_performance`, `originating_federal_contract_opportunity`
- If multiple linked awards are returned or the user explicitly wants incumbent or post-award history, call `Documentation` for `article_name="Aggregation_and_Leaderboard_Guide"` and add an aggregation-only pass on the same direct linkage:
- Use `Search_Federal_Contract_Awards`
- Use `query: ""`
- Use `search_mode: "keyword"`
- Use `per_page: 0`
- Reuse the same `federal_meta_opportunity_ids`
- Use `aggregations` such as `dollars_obligated_stats`, `top_awardees_by_dollars_obligated`, and `top_contracting_federal_agencies_by_dollars_obligated`
- Use that aggregation branch only to summarize the direct historical award footprint tied to the notice thread and to distinguish a single obvious incumbent from a more distributed linked-award history.
- Only describe a linked award, incumbent, or post-award outcome when the connection is direct and returned by the dataset.
- Do not infer incumbent status from loose similarity, keywords, or adjacent awards.
7. Normalize awardee identity only when a linked award was found and identity context materially improves the answer.
- Call `Documentation` for `article_name="Search_Vendors_Tool"` before entering this branch.
- If awardee identity, parent-child structure, or legal-entity naming matters, use `Search_Vendors`.
- Prefer `vendor_ids` from award relationships when available.
- If no vendor ID is available, use an exact quoted company name in `Search_Vendors.query` with `search_mode: "keyword"`.
- Request `govtribe_id`, `govtribe_url`, `name`, `uei`, `dba`, `parent_or_child`, `parent`, `business_types`, `sba_certifications`, and `govtribe_ai_summary`.
- Use vendor normalization to clarify identity, not to replace opportunity or award evidence.
8. If the opportunity clearly references a vehicle or IDV and the user asked for that context, run one targeted parent lookup.
- IDV path with `Search_Federal_Contract_IDVs`:
- Call `Documentation` for `article_name="Search_Federal_Contract_IDVs_Tool"` before this path.
- Use the linked IDV when available, or an exact quoted contract number if that is all you have.
- Request `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `last_date_to_order`, `contract_type`, `description`, `govtribe_ai_summary`, `ceiling_value`, `set_aside`, `multiple_or_single_award`, `awardee`, `parent_of_awardee`, `federal_contract_vehicle`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `place_of_performance`, `task_orders`, `blanket_purchase_agreements`, and `originating_federal_contract_opportunity`
- Vehicle path with `Search_Federal_Contract_Vehicles`:
- Call `Documentation` for `article_name="Search_Federal_Contract_Vehicles_Tool"` before this path.
- Use the linked vehicle when available, or an exact quoted vehicle name if needed.
- Request `govtribe_id`, `govtribe_url`, `name`, `award_date`, `last_date_to_order`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `set_aside_type`, `shared_ceiling`, `originating_federal_contract_opportunity`, `federal_agency`, and `federal_contract_awards`
- Do not call both parent tools unless the evidence requires both and the extra call materially improves correctness.
9. Escalate file-content retrieval only when it is clearly necessary.
- If `content_snippet` is not enough and the user explicitly needs deeper document grounding, call `Documentation` for `article_name="Vector_Store_Content_Retrieval_Guide"`, then use `Add_To_Vector_Store`, then `Search_Vector_Store`.
- If the user’s real goal becomes post-award contract tracing, switch to or recommend the **Award Deep Dive** workflow on the linked award rather than overloading this prompt.
- State clearly when no files, no linked awards, no parent context, or no confirmed incumbent can be shown from the retrieved record set.
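The core step order (resolve, pull files, optionally trace linked awards) can be sketched as follows. This is a simplified illustration under stated assumptions: `call_tool` is a hypothetical dispatcher, result shapes are flattened to single dicts (real search results return lists of records), and field lists are abbreviated from the fuller lists in the steps above.

```python
def opportunity_deep_dive(call_tool, target_query, want_award_trace=False):
    """Sketch of the resolve -> files -> optional award-trace step order."""
    # Step 2: exact resolution with keyword mode and a quoted query.
    opp = call_tool(
        "Search_Federal_Contract_Opportunities",
        search_mode="keyword",
        query=target_query,
        fields_to_return=["govtribe_id", "federal_meta_opportunity_id"],
    )
    # Step 5: ID-defined cohort, so query is empty but still set.
    files = call_tool(
        "Search_Government_Files",
        search_mode="keyword",
        query="",
        federal_contract_opportunity_ids=[opp["govtribe_id"]],
        fields_to_return=["govtribe_id", "name", "content_snippet"],
    )
    # Step 6: award trace only on direct linkage, after reading its docs.
    awards = None
    if want_award_trace and opp.get("federal_meta_opportunity_id"):
        call_tool("Documentation",
                  article_name="Search_Federal_Contract_Awards_Tool")
        awards = call_tool(
            "Search_Federal_Contract_Awards",
            search_mode="keyword",
            query="",
            federal_meta_opportunity_ids=[opp["federal_meta_opportunity_id"]],
        )
    return {"opportunity": opp, "files": files, "awards": awards}
```

The branch gating mirrors the steps: the awards call happens only when the user wants the trace and the resolved record carries a `federal_meta_opportunity_id`, so no incumbent claim can be produced from similarity alone.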
## Tool Budget
Design the workflow to stay compact.
Typical path:
- 4 documentation calls
- 1 opportunity lookup
- 1 government files call
- 0 to 1 linked-awards call
- 0 to 1 vendor normalization call
- 0 to 1 parent IDV or vehicle call
Expected total:
- Typical: 6 to 8 calls
- High end: 9 to 11 calls
Avoid exceeding 15 calls unless an extra call materially changes correctness.
## Output Format
Use markdown tables for key facts, key files, and any multi-row linked-award summaries.
Do not use Mermaid charts in this workflow.
Return the answer in this order:
1. **Target Opportunity Summary**
- Briefly explain how the opportunity was resolved
2. **Key Notice Facts**
   - Always use a 2-column markdown table for the key facts
- Include solicitation number, opportunity type, agency, posted and due dates, set-aside, NAICS/PSC, place of performance, vehicle context, and the most relevant scope language
3. **Key Government Files**
- Use a compact markdown table
- Recommended columns: `File`, `Posted`, `Why It Matters`
4. **Linked Awards / Incumbent Trace**
- Include only direct linked-award evidence
- If multiple linked awards were retrieved, you may use a compact markdown table
- If the linked-award aggregation branch was used, include a short aggregated history note
- If no direct link is confirmed, say that clearly
5. **Parent Vehicle or IDV Context**
- Include only if applicable and actually retrieved
6. **Risks, Gaps, or Unknowns**
- Briefly note identity ambiguity, missing attachments, unclear linkage, or other data limits
7. **Overall Confidence**
- State overall confidence and why
## Citation Rules
- Only cite sources retrieved in the current workflow.
- Never fabricate citations, URLs, IDs, or quote spans.
- Use exactly the citation format required by the host application.
- Attach citations to the specific claims they support, not only at the end.
## Grounding Rules
- Base claims only on provided context or GovTribe MCP tool outputs.
- If sources conflict, state the conflict explicitly and attribute each side.
- If the context is insufficient or irrelevant, narrow the answer or state that the goal cannot be fully completed from the available evidence.
- If a statement is an inference rather than a directly supported fact, label it as an inference.
