# Relevant Opportunities
## User Input
- **Target profile:** [Company, product, capability, website, or offering description]
## Goal
Use GovTribe MCP tools to identify the open federal contract or grant opportunities most relevant to a target company, solution, or capability profile.
Prioritize opportunities the target is plausibly positioned to pursue, then rank them with grounded evidence, explicit caveats, and clear confidence.
## Required Documentation
Before doing any work, call the **GovTribe Documentation** tool and read the documentation required for this workflow.
Required documentation to retrieve and read:
- `article_name="Search_Query_Guide"`
- `article_name="Search_Mode_Guide"`
- `article_name="Search_Vendors_Tool"`
- `article_name="Search_Federal_Contract_Opportunities_Tool"`
- `article_name="Search_Federal_Grant_Opportunities_Tool"`
Documentation rules:
- Call the **GovTribe Documentation** tool before the first research or search step.
- Read every required documentation article before using other GovTribe tools.
- Add `article_name="Aggregation_and_Leaderboard_Guide"` when market sizing, narrowing, or concentration checks will affect the answer.
- Add `article_name="Date_Filtering_Guide"` or `article_name="Location_Filtering_Guide"` when you use date or location filters.
- Add `article_name="Search_Federal_Contract_Awards_Tool"` or `article_name="Search_Federal_Grant_Awards_Tool"` when incumbency, recompete, or grant-history validation will affect the ranking.
- Add `article_name="Search_Federal_Agencies_Tool"`, `article_name="Search_Naics_Categories_Tool"`, `article_name="Search_Psc_Categories_Tool"`, `article_name="Search_Federal_Grant_Programs_Tool"`, `article_name="Search_User_Files_Tool"`, `article_name="Search_Government_Files_Tool"`, or `article_name="Create_Saved_Search_Tool"` when those tools are needed for ID resolution, supporting evidence, or saved-search creation.
- Add `article_name="Vector_Store_Content_Retrieval_Guide"` before using `Add_To_Vector_Store` or `Search_Vector_Store`.
- Treat the documentation as binding for tool names, parameters, field definitions, valid filters, and output assumptions.
## Required Input
The user must provide a target company, solution, or capability profile before analysis begins.
Accept any of the following:
- Company name
- GovTribe link
- Company website
- Capability statement, one-pager, or other uploaded material
- Plain-language description of the company’s offerings
Optional constraints the user may provide:
- Agencies or customers of interest
- Geographic scope
- NAICS, PSC, or other classifications
- Contract type, funding instrument, or vehicle preferences
- Set-aside or eligibility preferences
- Due date window
- Incumbent or recompete preference
- Value range
- Clearances, certifications, or deployment requirements
- Whether to create a saved search or alert
Input rules:
- If the target profile resolves cleanly, proceed immediately.
- If the target is too vague to search well, ask for the minimum missing detail needed to proceed.
- Do not guess the target or silently broaden it into a generic market scan.
- Do not start substantive opportunity analysis until the target profile is resolved.
## Workflow
### Rules
- Call `Documentation` before doing any work and before using any other GovTribe tool.
- Restrict the workflow to open federal contract opportunities and federal grant opportunities; use awards only for validation branches that materially improve the ranking.
- Always set both `search_mode` and `query` on every `Search_*` call.
- Use `fields_to_return` whenever you need more than `govtribe_id`.
- Do not stop early when another tool call is required by the workflow.
- Keep calling tools until the task is complete or the tool budget is reached.
- If a tool returns empty or partial results and the workflow defines another defensible strategy, continue with that next strategy.
- Use `query: ""` only when filters define the cohort.
- Use `per_page: 0` only for aggregation-only passes.
- Resolve the target with `Search_Vendors` when the input is a company name, GovTribe link, or other known vendor identity.
- Use `Search_User_Files` for user-provided files and `Search_Government_Files` for opportunity file evidence when snippets or metadata materially improve fit judgments.
- Use `Add_To_Vector_Store` and `Search_Vector_Store` only after the documented escalation path when snippets are insufficient.
- Start with keyword and structured-filter retrieval before semantic expansion.
- Do not keep or over-rank opportunities that are broad, weakly related, or supported mostly by keyword overlap.
- If evidence is insufficient after review, say so clearly and stop.
### Steps
1. Call `Documentation` before any other GovTribe tool, read the required articles, and add the optional articles needed for the exact path you will run.
2. Resolve the target company, solution, or capability profile and extract the strongest reusable signals.
- Use `Search_Vendors` when the input is a company name, GovTribe link, or otherwise appears to map to a vendor record.
- For exact company names or identifiers, use `search_mode: "keyword"` with a quoted `query`.
- Request the fields needed to interpret the target, including identifiers, business types, certifications, classifications, parent-child context, descriptions or summaries, and linked award context when needed.
- Capture as many of these signals as possible: company name, core offerings, product versus services versus software versus R&D profile, customer or mission alignment, NAICS, PSC, certifications, set-aside status, geography, likely value band, technical keywords, and recent relevant awards when they materially sharpen fit.
- If the user supplied only a website URL, treat website content as usable only when that content is already present in user-provided context or available through an allowed file workflow.
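As a sketch of the resolution step above, a vendor lookup for an exact company name could look like the following. The company name is hypothetical, and the exact parameter schema and available field names must be confirmed against the `Search_Vendors_Tool` documentation before use:

```json
{
  "tool": "Search_Vendors",
  "search_mode": "keyword",
  "query": "\"Acme Federal Solutions\"",
  "fields_to_return": ["govtribe_id", "name", "description"]
}
```

The quoted `query` follows the exact-name rule in this step; widen `fields_to_return` to whatever identifier, certification, and classification fields the documentation defines when those signals are needed.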
3. Review text evidence in addition to structured fields.
- Use `Search_User_Files` for uploaded capability statements, one-pagers, and other user-provided files.
- Request `fields_to_return` such as `name`, `description`, `content_snippet`, `download_url`, and `govtribe_ai_summary` when file evidence is needed.
- If `content_snippet` is insufficient, read `article_name="Vector_Store_Content_Retrieval_Guide"` first, then use `Add_To_Vector_Store`, then `Search_Vector_Store`.
- Keep the strongest technical phrases, mission language, and delivery-model clues for later `query` construction.
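A file-evidence call following this step might be shaped like the sketch below, using only the fields named above; treat the parameter layout as an assumption to verify against the `Search_User_Files_Tool` documentation:

```json
{
  "tool": "Search_User_Files",
  "search_mode": "keyword",
  "query": "capability statement",
  "fields_to_return": ["name", "description", "content_snippet", "download_url", "govtribe_ai_summary"]
}
```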
4. Resolve filter IDs before broad opportunity retrieval when the user supplied structured constraints.
- Use `Search_Federal_Agencies` for `federal_agency_ids`.
- Use `Search_Naics_Categories` for `naics_category_ids`.
- Use `Search_Psc_Categories` for `psc_category_ids`.
- Use `Search_Federal_Grant_Programs` for `federal_grant_program_ids`.
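An ID-resolution call can stay minimal, since only the resolved identifier is needed downstream. The NAICS code below is an illustrative example, and the schema should be checked against the `Search_Naics_Categories_Tool` documentation:

```json
{
  "tool": "Search_Naics_Categories",
  "search_mode": "keyword",
  "query": "\"541512\"",
  "fields_to_return": ["govtribe_id", "name"]
}
```

The returned `govtribe_id` values then feed `naics_category_ids` in later opportunity searches.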
5. Run an aggregation-first market-sizing and narrowing pass when the initial scoped market is broad or when concentration checks will improve the answer.
- Use `query: ""`, `search_mode: "keyword"`, and `per_page: 0` when filters define the cohort.
- Contract path:
- Use `Search_Federal_Contract_Opportunities`.
- Apply concrete filters such as `federal_agency_ids`, `naics_category_ids`, `psc_category_ids`, `set_aside_types`, `place_of_performance_ids`, `due_date_range`, and `opportunity_types`.
- Use `aggregations` such as `top_federal_agencies_by_doc_count`, `top_set_aside_types_by_doc_count`, `top_locations_by_doc_count`, `top_naics_codes_by_doc_count`, and `top_psc_codes_by_doc_count`.
- Grant path:
- Use `Search_Federal_Grant_Opportunities`.
- Apply concrete filters such as `federal_agency_ids`, `federal_grant_program_ids`, `funding_instrument_types`, `funding_activity_categories`, `due_date_range`, and `opportunity_types`.
- Use `aggregations` such as `top_federal_agencies_by_doc_count`, `top_federal_grant_programs_by_doc_count`, and `top_points_of_contact_by_doc_count` when outreach density or filing complexity materially affects the answer.
- Use the aggregation results to estimate market size, describe whether the scoped market is concentrated or fragmented, and tighten filters before row-by-row review when the cohort is still too broad.
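The aggregation-first contract pass above can be sketched as follows. The angle-bracket values are placeholders for IDs resolved in Step 4, and the exact filter and aggregation parameter names should be confirmed against the `Aggregation_and_Leaderboard_Guide`:

```json
{
  "tool": "Search_Federal_Contract_Opportunities",
  "search_mode": "keyword",
  "query": "",
  "per_page": 0,
  "naics_category_ids": ["<resolved-naics-id>"],
  "set_aside_types": ["<set-aside>"],
  "aggregations": [
    "top_federal_agencies_by_doc_count",
    "top_naics_codes_by_doc_count",
    "top_set_aside_types_by_doc_count"
  ]
}
```

Because `per_page` is `0`, the call returns only the aggregation buckets used for market sizing, not row-level results.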
6. Run a keyword and structured-filter opportunity retrieval pass before semantic expansion.
- Do not proceed to Step 7 until you have retrieved and reviewed row-level results from the Step 5 cohort.
- If the Step 5 cohort is 200 results or fewer, paginate through the full cohort before any semantic pass.
- Contract path:
- Use `Search_Federal_Contract_Opportunities`.
- Use `query: ""` when filters define the cohort.
- Request `fields_to_return` explicitly, including `govtribe_id`, `govtribe_url`, `name`, `solicitation_number`, `descriptions`, `govtribe_ai_summary`, `federal_agency`, `naics_category`, `psc_category`, `set_aside_type`, `federal_contract_vehicle`, `place_of_performance`, `posted_date`, `due_date`, `government_files`, and `points_of_contact`.
- Grant path:
- Use `Search_Federal_Grant_Opportunities`.
- Use `query: ""` when filters define the cohort.
- Request `fields_to_return` explicitly, including `govtribe_id`, `govtribe_url`, `name`, `solicitation_number`, `description`, `govtribe_ai_summary`, `federal_agency`, `federal_grant_programs`, `funding_instruments`, `applicant_types`, `funding_activity_categories`, `posted_date`, `due_date`, `government_files`, and `points_of_contact`.
- If the user requested only open or active results, make that an explicit filter decision rather than a loose interpretation.
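A row-retrieval call for the contract path might look like the sketch below, with `fields_to_return` drawn from the list above (abbreviated here for compactness) and placeholder filter IDs carried over from earlier steps; verify the schema against the `Search_Federal_Contract_Opportunities_Tool` documentation:

```json
{
  "tool": "Search_Federal_Contract_Opportunities",
  "search_mode": "keyword",
  "query": "",
  "per_page": 50,
  "naics_category_ids": ["<resolved-naics-id>"],
  "fields_to_return": [
    "govtribe_id", "govtribe_url", "name", "solicitation_number",
    "descriptions", "govtribe_ai_summary", "federal_agency",
    "naics_category", "set_aside_type", "posted_date", "due_date"
  ]
}
```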
7. Broaden only after the keyword and filter-first pass.
- Semantic search is a broadening pass only. It must not replace the Step 6 keyword and structured-filter row retrieval pass.
- Use the same opportunity search tool with `search_mode: "semantic"` and a concise plain-language `query` built from the resolved capability profile, delivery model, mission language, and domain synonyms.
- Keep the strongest structured filters in place while broadening.
- Use `_score`-based `sort` for semantic passes unless the user specifically needs due-date ordering or another sort key.
- Use `per_page` and follow-up calls as needed. Do not stop because the first page contains plausible matches.
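The semantic broadening pass keeps the structural filters and swaps the mode and query, roughly as below. The `query` text is a hypothetical capability-profile phrasing, and the `sort` syntax is an assumption to confirm against the `Search_Mode_Guide`:

```json
{
  "tool": "Search_Federal_Contract_Opportunities",
  "search_mode": "semantic",
  "query": "managed cybersecurity monitoring and incident response for civilian agency networks",
  "naics_category_ids": ["<resolved-naics-id>"],
  "sort": "_score",
  "per_page": 25
}
```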
8. Compare each candidate opportunity to the target and keep only meaningfully relevant opportunities.
- Evaluate direct scope fit, mission or customer fit, delivery-model fit, classification overlap, contract type or funding instrument fit, eligibility fit, geography, due-date practicality, and alignment between the target profile and the opportunity text (`descriptions` for contracts, `description` for grants) plus `govtribe_ai_summary`.
- Exclude opportunities that are too broad, too generic, too far outside the target’s delivery model, or supported only by weak keyword overlap.
9. Add validation branches only when they materially improve the answer.
- Use `Search_Federal_Contract_Awards` when incumbent, recompete, or prior contract-delivery evidence matters.
- Use `Search_Federal_Grant_Awards` when grant-history validation materially improves a grant ranking.
- For cohort-level validation, you may run an aggregation-only pass with `query: ""`, `search_mode: "keyword"`, `per_page: 0`, and the same structural filters to confirm the refined market shape.
- If the user requested a reusable saved search, preserve the final search persistence identifier and call `Create_Saved_Search` only after the final cohort is defined.
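When a saved search is requested, the handoff call after the final cohort is defined might be shaped like the sketch below. The `name` and `frequency` values are illustrative, and the filter payload and required persistence parameters must be taken from the `Create_Saved_Search_Tool` documentation rather than this sketch:

```json
{
  "tool": "Create_Saved_Search",
  "name": "Open opportunities - <target profile>",
  "frequency": "daily",
  "naics_category_ids": ["<resolved-naics-id>"],
  "set_aside_types": ["<set-aside>"]
}
```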
10. Rank the remaining opportunities using the fit labels below. Base the ranking on evidence, not intuition.
11. Perform a verification pass on the top-ranked opportunities.
- Remove weak, edge-case, or low-similarity matches and check whether the ordering still holds.
- Re-run the final filtered cohort rather than trusting the first page.
- Use additional pagination or a repeat aggregation check when needed to confirm coverage.
- If the ranking changes materially after cleanup, lower confidence and explain why.
### Fit Labels
- **Very High**: Strong direct fit to the target’s offering and customer mission, with no obvious eligibility or delivery-model mismatch.
- **High**: Clear relevance with multiple supporting signals, but not as direct as the strongest matches.
- **Medium**: Plausible and potentially worth watching, but one or more meaningful gaps remain.
- **Low**: Only partial or adjacent fit. Plausible, but weakly supported.
- **Exclude**: Clear mismatch in scope, customer, eligibility, delivery model, geography, timing, or textual evidence.
Scoring factors:
- Direct fit to the target’s core offering
- Alignment with the target’s delivery model
- Same or very similar customer mission
- Same or adjacent NAICS, PSC, or grant-program pattern
- Similar technical requirements
- Similar contract type, funding instrument, or vehicle pattern
- Eligibility fit, including set-aside or applicant-type alignment
- Geographic and operational fit
- Realistic pursuit timing based on due date and stage
- Consistency between the target profile and opportunity text or summary evidence
Ranking rules:
- Do not rank an opportunity above **Medium** unless there is strong direct scope fit and at least one additional supporting signal such as the same agency, same classification, same contract type, same operational environment, or strong text alignment.
- Do not include opportunities based mostly on keyword overlap.
## Tool Budget
Design the workflow to stay compact.
Typical path:
- 3 to 5 documentation calls
- 1 target-profile resolution call
- 0 to 3 ID-resolution or file-evidence calls
- 1 aggregation-first market-sizing pass
- 1 keyword and filter-first row pass
- 0 to 1 semantic pass
- 0 to 1 award-validation or saved-search handoff
Expected total:
- Typical: 7 to 10 calls
- High end with fallback: 11 to 13 calls
Budget rule:
- Avoid exceeding 15 calls unless an additional call materially changes correctness, completeness, or grounding.
## Output Format
Return the answer in this order:
1. **Target Profile Summary**
- Briefly summarize how the target company, solution, or capability profile was interpreted.
2. **Search Approach**
- Briefly explain which documentation articles were used.
- Briefly explain which `Search_*` tools, filters, and parameters were most important.
- Briefly explain how the aggregation-first pass, keyword and filter-first row pass, and semantic pass were used.
3. **Market Slice Summary**
- Start with a compact markdown table summarizing the dominant agencies, programs, classifications, set-aside patterns, or other market-shape signals in the scoped cohort.
- Briefly summarize whether the scoped market looked concentrated or fragmented.
- If concentration or distribution is central to the answer, you may add one small Mermaid `pie` or `xychart-beta` chart.
- Add the chart only when it materially improves interpretation, include a short explanation, and fall back to the compact table if the data is sparse or Mermaid is unavailable.
4. **Top Relevant Open Opportunities**
- Present this section as a compact markdown table first.
- Recommended columns: `Rank`, `Opportunity`, `Agency`, `Due Date`, `Fit`, `Why It Matters`.
- Keep the table compact and move overflow detail into short notes immediately below the table when needed.
5. **Why Others Were Excluded**
- Briefly note close-but-rejected or weak-fit opportunities.
6. **Saved Search / Alert Recommendation**
- If relevant, provide the recommended or created saved-search logic, filters, `frequency`, and `name`.
- Keep this section prose or a short field list.
7. **Overall Confidence**
- State overall confidence in the ranking and explain why.
## Citation Rules
- Only cite sources retrieved in the current workflow.
- Never fabricate citations, URLs, IDs, or quote spans.
- Use exactly the citation format required by the host application.
- Attach citations to the specific claims they support, not only at the end.
## Grounding Rules
- Base claims only on provided context or GovTribe MCP tool outputs.
- If sources conflict, state the conflict explicitly and attribute each side.
- If the context is insufficient or irrelevant, narrow the answer or state that the goal cannot be fully completed from the available evidence.
- If a statement is an inference rather than a directly supported fact, label it as an inference.
