# Federal Contract Opportunity Deep Dive
## User Input
- **Target opportunity:** [Solicitation number, GovTribe link, or opportunity title plus agency]
## Goal
Use GovTribe MCP tools to resolve a specific federal contract opportunity, notice, solicitation, sources sought, presolicitation, or related contracting notice and produce a grounded record-level brief.
Summarize the key notice facts, surface the most useful government files, trace the opportunity to directly linked awards or incumbent signals when the evidence supports that connection, and include parent vehicle or IDV context when the resolved notice clearly references it.
## Required Input
The user must provide a specific target opportunity before analysis begins.
Accept any of the following:
- Solicitation number
- GovTribe link
- Opportunity title plus agency
- Plain-language description only if it is specific enough to resolve a single opportunity
Input rules:
- If the input resolves cleanly to one target, proceed immediately.
- If the input is too vague to resolve a single opportunity, ask for the minimum missing detail needed to proceed.
- Government files, direct linked-award or incumbent trace, and vehicle or IDV context are part of the default workflow when the retrieved evidence supports them.
- Do not guess the target.
- Do not start substantive analysis until the target is resolved.
## Workflow
### Steps
1. Call `Documentation` once with `article_names=["Search_Query_Guide", "Search_Mode_Guide", "Aggregation_and_Leaderboard_Guide", "Vector_Store_Content_Retrieval_Guide"]`.
- Use the documentation results to confirm valid tool names, `search_mode`, `query`, `fields_to_return`, relationship fields, and sort keys before searching.
2. Resolve the target opportunity with `Search_Federal_Contract_Opportunities`.
- Favor exact lookup first.
- For exact solicitation numbers, exact titles, or GovTribe-derived identifiers, wrap the value in double quotes in `query` to force an exact-phrase match.
- If the user provides a GovTribe link, use the record identity embedded in the link when possible; otherwise resolve it through exact quoted lookup.
- If the user provides an opportunity title plus agency, use the strongest exact title phrase plus agency context, then disambiguate with returned fields.
- Use `fields_to_return` explicitly. At minimum request:
- `govtribe_id`, `govtribe_url`, `govtribe_type`, `solicitation_number`, `name`, `opportunity_type`, `set_aside_type`, `posted_date`, `due_date`, `award_date`, `descriptions`, `govtribe_ai_summary`, `federal_meta_opportunity_id`, `federal_contract_vehicle`, `federal_agency`, `place_of_performance`, `naics_category`, `psc_category`, `government_files`, `federal_contract_awards`, `federal_contract_idvs`, `points_of_contact`
- Use date sorting only when multiple historical variants or amendment-related versions need review.
3. If multiple opportunities match, disambiguate with the minimum additional evidence needed.
- Compare agency, opportunity type, posted date, due date, set-aside, NAICS, PSC, and vehicle context.
- Do not merge multiple possible matches into one narrative.
- If the record still cannot be resolved to a single opportunity, stop and ask for one clarifying detail.
4. If exact resolution fails and the user input is still specific enough, run one narrow recovery retry before asking for clarification.
- Broaden only one dimension at a time, such as title phrasing or agency context.
- If the retry still does not produce a single defensible match, stop and ask for the minimum missing detail.
5. Pull government attachments tied to the resolved opportunity with `Search_Government_Files`.
- Use `federal_contract_opportunity_ids` with the resolved opportunity GovTribe ID.
- Request:
- `govtribe_id`, `govtribe_url`, `name`, `posted_date`, `content_snippet`, `download_url`, `parent_record`, `govtribe_ai_summary`
- Focus on files that materially explain the scope, amendments, Q&A responses, statements of work, or evaluation context.
- If no files are returned, say that clearly instead of implying the notice had no attachments.
6. Use only direct linkage evidence for linked-award, incumbent, and post-award context.
- If the resolved opportunity has `federal_meta_opportunity_id`, call `Search_Federal_Contract_Awards` with `federal_meta_opportunity_ids`.
- Request:
- `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `completion_date`, `ultimate_completion_date`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `dollars_obligated`, `ceiling_value`, `set_aside_type`, `awardee`, `parent_of_awardee`, `federal_contract_idv`, `federal_contract_vehicle`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `place_of_performance`, `originating_federal_contract_opportunity`
- If multiple linked awards are returned and concentration or post-award interpretation matters, add an aggregation-only pass on the same direct linkage:
- Use `Search_Federal_Contract_Awards`
- Use `per_page: 0`
- Reuse the same `federal_meta_opportunity_ids`
- Use `aggregations` such as `dollars_obligated_stats`, `top_awardees_by_dollars_obligated`, and `top_contracting_federal_agencies_by_dollars_obligated`
- Use that aggregation branch only to summarize the direct historical award footprint tied to the notice thread and to distinguish a single obvious incumbent from a more distributed linked-award history.
- Only describe a linked award, incumbent, or post-award outcome when the connection is direct and returned by the dataset.
- Do not infer incumbent status from loose similarity, keywords, or adjacent awards.
7. Normalize awardee identity only when a linked award was found and identity context materially improves the answer.
- If awardee identity, parent-child structure, or legal-entity naming matters, use `Search_Vendors`.
- Prefer `vendor_ids` from award relationships when available.
- If no vendor ID is available, use an exact quoted company name in `Search_Vendors.query`.
- Request `govtribe_id`, `govtribe_url`, `name`, `uei`, `dba`, `parent_or_child`, `parent`, `business_types`, `sba_certifications`, and `govtribe_ai_summary`.
- Use vendor normalization to clarify identity, not to replace opportunity or award evidence.
8. If the opportunity clearly references a vehicle or IDV, run one targeted parent lookup.
- IDV path with `Search_Federal_Contract_IDVs`:
- Use the linked IDV when available, or an exact quoted contract number if that is all you have.
- Request `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `last_date_to_order`, `contract_type`, `description`, `govtribe_ai_summary`, `ceiling_value`, `set_aside`, `multiple_or_single_award`, `awardee`, `parent_of_awardee`, `federal_contract_vehicle`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `place_of_performance`, `task_orders`, `blanket_purchase_agreements`, and `originating_federal_contract_opportunity`
- Vehicle path with `Search_Federal_Contract_Vehicles`:
- Use the linked vehicle when available, or an exact quoted vehicle name if needed.
- Request `govtribe_id`, `govtribe_url`, `name`, `award_date`, `last_date_to_order`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `set_aside_type`, `shared_ceiling`, `originating_federal_contract_opportunity`, `federal_agency`, and `federal_contract_awards`
- Do not call both parent tools unless the evidence requires both and the extra call materially improves correctness.
9. Escalate file-content retrieval only when it is clearly necessary.
- If `content_snippet` is not enough and the user explicitly needs deeper document grounding, use `Add_To_Vector_Store`, then `Search_Vector_Store`.
- State clearly when no files, no linked awards, no parent context, or no confirmed incumbent can be shown from the retrieved record set.
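As an illustrative sketch only, the exact-lookup resolution in step 2 might be shaped like the following tool-call payload. The solicitation number shown is a hypothetical placeholder, and the field list is abbreviated from the full list in step 2:

```json
{
  "tool": "Search_Federal_Contract_Opportunities",
  "arguments": {
    "query": "\"FA8750-25-R-0001\"",
    "fields_to_return": [
      "govtribe_id", "govtribe_url", "solicitation_number", "name",
      "opportunity_type", "set_aside_type", "posted_date", "due_date",
      "federal_agency", "federal_meta_opportunity_id", "government_files"
    ]
  }
}
```

The double-quoted `query` value matches the exact-lookup guidance in step 2; confirm the precise syntax against the `Search_Query_Guide` documentation retrieved in step 1.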
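The attachment pull in step 5 can be sketched the same way, filtering on the resolved opportunity's relationship ID rather than a text query. The ID placeholder below is hypothetical:

```json
{
  "tool": "Search_Government_Files",
  "arguments": {
    "federal_contract_opportunity_ids": ["<resolved govtribe_id>"],
    "fields_to_return": [
      "govtribe_id", "govtribe_url", "name", "posted_date",
      "content_snippet", "download_url", "govtribe_ai_summary"
    ]
  }
}
```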
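Finally, the aggregation-only branch in step 6 can be sketched as a call that returns no individual records (`per_page: 0`) and only summary statistics over the directly linked awards. The meta-opportunity ID is a hypothetical placeholder:

```json
{
  "tool": "Search_Federal_Contract_Awards",
  "arguments": {
    "federal_meta_opportunity_ids": ["<federal_meta_opportunity_id>"],
    "per_page": 0,
    "aggregations": [
      "dollars_obligated_stats",
      "top_awardees_by_dollars_obligated",
      "top_contracting_federal_agencies_by_dollars_obligated"
    ]
  }
}
```

Because `per_page` is 0, this pass summarizes the linked-award footprint without re-retrieving record detail, which keeps it consistent with the direct-linkage-only rule in step 6.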
## Output Format
Use markdown tables for key facts, key files, and any multi-row linked-award summaries.
Return the answer in this order:
1. **Target Opportunity Summary**
- Briefly explain how the opportunity was resolved
2. **Key Notice Facts**
- Always use a two-column markdown table for the key facts
- Include solicitation number, opportunity type, agency, posted and due dates, set-aside, NAICS/PSC, place of performance, vehicle context, and the most relevant scope language
3. **Key Government Files**
- Use a compact markdown table
- Recommended columns: `File`, `Posted`, `Why It Matters`
4. **Linked Awards / Incumbent Trace**
- Include only direct linked-award evidence
- If multiple linked awards were retrieved, you may use a compact markdown table
- If the linked-award aggregation branch was used, include a short aggregated history note
- If no direct link is confirmed, say that clearly
5. **Parent Vehicle or IDV Context**
- Include only if applicable and actually retrieved
6. **Risks, Gaps, or Unknowns**
- Briefly note identity ambiguity, missing attachments, unclear linkage, or other data limits
7. **Overall Confidence**
- State overall confidence and why
## Citation Rules
- Only cite sources retrieved in the current workflow.
- Never fabricate citations, URLs, IDs, or quote spans.
- Use exactly the citation format required by the host application.
- Attach citations to the specific claims they support, not only at the end.
## Grounding Rules
- Base claims only on provided context or GovTribe MCP tool outputs.
- If sources conflict, state the conflict explicitly and attribute each side.
- If the context is insufficient or irrelevant, narrow the answer or state that the goal cannot be fully completed from the available evidence.
- If a statement is an inference rather than a directly supported fact, label it as an inference.