Use this prompt when you already know the company you care about and the federal buyer office you want to expand into. It maps what that office has bought recently, how it tends to buy, which contacts are grounded in the record, and where the company has direct or adjacent evidence of fit. It is useful for account planning, buyer expansion, and turning office-level buying history into a concrete pursuit plan.

# Buyer Expansion Plan

## User Input
- **Target vendor:** [Vendor name, UEI, CAGE, GovTribe link, or company description]
- **Target buyer office:** [Federal agency, component, office, VISN, IC, command, acronym, or GovTribe link]

## Goal
Use GovTribe MCP tools to build a grounded **one-office federal buyer expansion plan** for a specific target vendor and a specific target buyer office.

Use recent federal contract award evidence as the primary source of truth for what the office buys, how it buys, who appears in contact signals, and whether the vendor has direct or adjacent evidence of fit. A complete answer should produce an office-specific plan with representative buys, evidence-backed contacts, ordering-path analysis, and a light micro-offer menu.

## Required Documentation
Before doing any work, call the **GovTribe Documentation** tool and read the documentation required for this workflow.

Required documentation to retrieve and read:
- `article_name="Search_Query_Guide"`
- `article_name="Search_Mode_Guide"`
- `article_name="Aggregation_and_Leaderboard_Guide"`
- `article_name="Date_Filtering_Guide"`
- `article_name="Search_Vendors_Tool"`
- `article_name="Search_Federal_Agencies_Tool"`
- `article_name="Search_Federal_Contract_Awards_Tool"`
- `article_name="Search_Contacts_Tool"`

Documentation rules:
- Call the **GovTribe Documentation** tool before the first research or search step.
- Read every required documentation article before using other GovTribe tools.
- Add `article_name="Search_Naics_Categories_Tool"` or `article_name="Search_Psc_Categories_Tool"` only when user-supplied filters or office-lane narrowing require canonical classification IDs.
- Add `article_name="Search_Federal_Contract_IDVs_Tool"` or `article_name="Search_Federal_Contract_Vehicles_Tool"` only when the ordering-path branch requires a dedicated IDV or vehicle lookup.
- Add `article_name="Search_Federal_Contract_Opportunities_Tool"` only when the optional current-opportunity overlay is needed.
- Add `article_name="Create_Saved_Search_Tool"` only when the user explicitly asks for a reusable saved search or alert.
- Treat the documentation as binding for tool names, parameters, filter names, `query`, `search_mode`, `fields_to_return`, `per_page`, aggregation options, and saved-search behavior.

## Required Input
The user must provide both of the following before analysis begins:

1. A **target vendor**
2. A **target buyer office**

Accept any of the following for the target vendor:
- Company name
- UEI
- CAGE
- GovTribe link
- Plain-language description only if it resolves to a single vendor without ambiguity

Accept any of the following for the target buyer office:
- Federal agency, component, office, VISN, IC, or command name
- Acronym
- GovTribe link
- Plain-language description only if it resolves to a single federal agency record without ambiguity

Optional constraints the user may provide:
- Time window for recent buying behavior
- Specific NAICS, PSC, or work category focus
- Whether to emphasize contracting-office behavior, funding-office behavior, or compare both
- Whether to include the optional current-opportunity overlay
- Whether to create a reusable saved search or alert
- Whether to focus on direct office wins only or also include adjacent vendor evidence

Input rules:
- If either the target vendor or target buyer office does not resolve cleanly to one entity, ask for the minimum missing detail required to proceed.
- If the user does not provide a time window, default to the last 24 months and say so explicitly.
- Do not guess the vendor, the office, or the office hierarchy.
- Do not start substantive analysis until both sides are resolved well enough to search.

## Workflow

### Rules
- Call `Documentation` before using any other GovTribe tool.
- Federal contract workflow only. Do not mix in grants or state and local procurement workflows.
- Always set both `search_mode` and `query` on every `Search_*` call.
- Use `query: ""` when structural filters define the cohort.
- Use `per_page: 0` only for aggregation-only cohorts.
- Use explicit `fields_to_return` whenever fields beyond `govtribe_id` are needed.
- Do not stop early when another tool call is required by the workflow.
- Keep calling tools until the task is complete or the tool budget is reached.
- If a tool returns empty or partial results and the workflow defines another defensible strategy, continue with that next strategy.
- Use recent office award evidence as the primary evidence surface for what the office buys and how it buys.
- Retrieve row-level award results before any semantic broadening or optional overlay branch.
- Do not require the vendor to already have wins in the target office. Adjacent evidence is allowed, but it must be labeled as adjacent evidence or inference.
- Do not present unsupported office familiarity, incumbent status, or contact access as fact.
- Keep the contact branch evidence-backed. Do not invent likely stakeholders, outreach names, or warm paths not supported by GovTribe records.
- Use the optional current-opportunity overlay and `Create_Saved_Search` only when they materially improve the office plan or the user explicitly asks for them.

### Steps
1. Call `Documentation` before any other GovTribe tool, read the required articles, and add only the optional documentation articles needed for the exact workflow path you will run.
   - Use the documentation results to confirm valid tool names, filter names, `fields_to_return`, `query`, `search_mode`, aggregation options, and saved-search requirements before searching.
2. Resolve the target vendor with `Search_Vendors`.
   - Use `search_mode: "keyword"` with a quoted `query` for exact names, UEIs, CAGEs, or GovTribe links.
   - Request `fields_to_return` explicitly. At minimum request `govtribe_id`, `govtribe_url`, `name`, `uei`, `dba`, `business_types`, `sba_certifications`, `parent_or_child`, `parent`, `naics_category`, and `govtribe_ai_summary`.
   - Normalize the exact vendor identity you will analyze. Do not silently widen scope to parent or child entities.
3. Resolve the target buyer office with `Search_Federal_Agencies`.
   - Use `search_mode: "keyword"` with a quoted `query` for exact names, acronyms, or GovTribe links.
   - Request `fields_to_return` explicitly. At minimum request `govtribe_id`, `govtribe_url`, `name`, `alternate_name`, `acronym`, and `defense_or_civilian`.
   - If the office label does not resolve to one federal agency record, stop and ask for the minimum clarification needed.
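The two resolution calls in Steps 2 and 3 can be sketched as payload dicts. This is an illustrative sketch only: the parameter names are the ones this prompt uses, the quoted vendor and office names are hypothetical placeholders, and the exact request schemas must be confirmed against the Documentation articles before calling the tools.

```python
# Illustrative Step 2 and Step 3 resolution payloads. Keys mirror the
# parameter names used in this prompt; confirm exact schemas against the
# required Documentation articles before calling the tools.

vendor_resolution = {
    "tool": "Search_Vendors",
    "search_mode": "keyword",
    "query": '"Example Vendor LLC"',  # quoted for exact matching; hypothetical name
    "fields_to_return": [
        "govtribe_id", "govtribe_url", "name", "uei", "dba",
        "business_types", "sba_certifications", "parent_or_child",
        "parent", "naics_category", "govtribe_ai_summary",
    ],
}

office_resolution = {
    "tool": "Search_Federal_Agencies",
    "search_mode": "keyword",
    "query": '"Example Contracting Office"',  # hypothetical office label
    "fields_to_return": [
        "govtribe_id", "govtribe_url", "name", "alternate_name",
        "acronym", "defense_or_civilian",
    ],
}

# Both payloads set search_mode and query explicitly, per the workflow rules.
for payload in (vendor_resolution, office_resolution):
    assert "search_mode" in payload and "query" in payload
```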
4. Normalize the planning window and any optional narrowing filters.
   - Default to the last 24 months when the user does not provide a time window.
   - Resolve NAICS and PSC IDs only when user-supplied filters or office-lane narrowing make that necessary.
   - Decide whether the office should be analyzed primarily as a `contracting_federal_agency`, a `funding_federal_agency`, or both.
   - If the user did not explicitly specify the office role, run a quick evidence check for both roles with the same planning window before committing to one. Do not choose the role based only on domain knowledge or agency familiarity.
   - If one role is sparse, test the other role and state which role is actually supported by the returned evidence.
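The default 24-month lookback in Step 4 can be computed as below. This is a minimal sketch that approximates 24 months as 730 days; the actual date-filter format the tools accept must come from the `Date_Filtering_Guide` article.

```python
from datetime import date, timedelta

def default_planning_window(today: date, months: int = 24) -> tuple[str, str]:
    """Return (start, end) ISO dates for the default lookback window.

    Approximates `months` as months * 365 // 12 days (730 for 24 months).
    The exact date-filter syntax must follow the Date_Filtering_Guide.
    """
    start = today - timedelta(days=months * 365 // 12)
    return start.isoformat(), today.isoformat()

start, end = default_planning_window(date(2025, 6, 30))
```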
5. Run an aggregation-first office buying pass with `Search_Federal_Contract_Awards`.
   - Use `query: ""`, `search_mode: "keyword"`, and `per_page: 0`.
   - Start with the resolved office as the strongest structural filter through `contracting_federal_agency_ids`, `funding_federal_agency_ids`, or both, based on the role decision from Step 4.
   - Keep the time window stable.
   - Add resolved `naics_category_ids` or `psc_category_ids` only when the user explicitly narrowed the office plan to those lanes.
   - Request aggregations that support buyer planning. At minimum use `dollars_obligated_stats`, `top_awardees_by_dollars_obligated`, `top_naics_codes_by_dollars_obligated`, `top_psc_codes_by_dollars_obligated`, `top_federal_contract_vehicles_by_dollars_obligated`, `top_idvs_by_dollars_obligated`, and `top_transaction_points_of_contact_by_dollars_obligated`.
   - Use this pass to identify dominant work categories, leading awardees, typical value bands, concentration, and whether the office appears to buy mainly through stand-alone awards, IDVs, or master vehicles.
6. Run a row-level office award retrieval pass with `Search_Federal_Contract_Awards` before any optional branch.
   - Reuse the same stable office and time-window filters from Step 5.
   - Use `query: ""` and `search_mode: "keyword"`.
   - Paginate as needed rather than assuming the first page is enough.
   - If the first page is dominated by very large outlier awards, off-lane buys, or records that are not representative of the vendor's plausible entry lane, run at least one additional row pass using evidenced lane filters or a more representative sort before selecting the office's representative buys.
   - Request `fields_to_return` explicitly. At minimum request `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `completion_date`, `ultimate_completion_date`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `dollars_obligated`, `ceiling_value`, `set_aside_type`, `awardee`, `parent_of_awardee`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `federal_contract_vehicle`, `federal_contract_idv`, `transaction_contacts`, and `originating_federal_contract_opportunity`.
   - Use these rows to select representative recent buys and confirm the office role, ordering-path pattern, and repeat-awardee behavior.
7. Run a vendor-fit and access pass with `Search_Federal_Contract_Awards`.
   - Use the resolved `vendor_ids` for the target vendor.
   - First check for direct office overlap in the same planning window using the same office filters from Step 5 when that is supported by the office-role decision.
   - If direct office evidence is sparse or absent, run one careful adjacent-evidence pass that compares the vendor against the office's dominant NAICS, PSC, value band, and recurring vehicle or IDV patterns.
   - Keep direct evidence separate from adjacent evidence. Label adjacent evidence as adjacent or inferred rather than presenting it as direct office familiarity.
   - Use this pass to decide whether the vendor appears to have direct office access, partial ordering-path access, only adjacent lane evidence, or no meaningful evidence yet.
8. Run the ordering-path branch only when the office award evidence shows meaningful IDV or vehicle concentration.
   - First use `federal_contract_vehicle` and `federal_contract_idv` already returned on award rows.
   - Only call `Search_Federal_Contract_IDVs` or `Search_Federal_Contract_Vehicles` when a dedicated follow-on lookup materially improves correctness.
   - For IDV follow-on calls, seed the search from IDs already returned on award rows whenever possible and request fields such as `govtribe_id`, `govtribe_url`, `name`, `contract_number`, `award_date`, `last_date_to_order`, `contract_type`, `description`, `govtribe_ai_summary`, `ceiling_value`, `multiple_or_single_award`, `set_aside`, `awardee`, `parent_of_awardee`, `federal_contract_vehicle`, `contracting_federal_agency`, `funding_federal_agency`, `naics_category`, `psc_category`, `task_orders`, `transaction_contacts`, and `originating_federal_contract_opportunity`.
   - For vehicle follow-on calls, seed the search from IDs already returned on award rows whenever possible and request fields such as `govtribe_id`, `govtribe_url`, `name`, `award_date`, `last_date_to_order`, `contract_type`, `descriptions`, `govtribe_ai_summary`, `set_aside_type`, `shared_ceiling`, `federal_agency`, `federal_contract_awards`, and `originating_federal_contract_opportunity`.
   - Use this branch to explain how the office routes work, whether the lane is stand-alone or vehicle-mediated, and whether the vendor already holds a plausible access path.
9. Run the evidence-backed contact branch with `Search_Contacts`.
   - Use the resolved office through `federal_agency_ids`.
   - Prefer `query: ""` and `search_mode: "keyword"` when the office filter and reference filters define the contact cohort.
   - Prefer `reference_types` of `pointOfContact` and `transactionContact`.
   - When representative awards from Step 6 materially sharpen the contact set, add `referenced_govtribe_ids` from those awards.
   - If the office-wide contact cohort is broad, noisy, or dominated by contacts outside the vendor-relevant lane, you must narrow it with `referenced_govtribe_ids` from the most representative awards.
   - Request `fields_to_return` explicitly. At minimum request `govtribe_id`, `govtribe_url`, `name`, `email`, `phone`, `title`, `role`, `organization`, and `parent_organization_details`.
   - If this branch returns no usable contacts, say so clearly. Do not invent broader outreach targets.
10. Use the current-opportunity overlay only when the user explicitly asks for it or when present-tense demand materially improves the office plan.
    - Read `Search_Federal_Contract_Opportunities_Tool` before running this branch.
    - Use `Search_Federal_Contract_Opportunities` with the resolved office and the strongest office-lane filters.
    - Use `query: ""` and `search_mode: "keyword"` when structural filters define the cohort.
    - Request `fields_to_return` explicitly. At minimum request `govtribe_id`, `govtribe_url`, `name`, `solicitation_number`, `opportunity_type`, `set_aside_type`, `posted_date`, `due_date`, `descriptions`, `govtribe_ai_summary`, `federal_meta_opportunity_id`, `federal_contract_vehicle`, `federal_agency`, `naics_category`, `psc_category`, and `points_of_contact`.
    - Use this branch only to validate whether active demand resembles the historical buying pattern and to support optional saved-search creation.
11. Create a saved search only when the user explicitly asks for a reusable search or alert.
    - Read `Create_Saved_Search_Tool` before running this branch.
    - Use a final opportunity-search result set as the persistence source. Do not create a saved search from the award-analysis result set.
    - Reuse the returned `search_results_id` from the final opportunity cohort, then call `Create_Saved_Search` with a user-appropriate `name` and `frequency`.
12. Perform a verification pass before finalizing the office plan.
    - Remove obvious outlier awards or weak adjacent-evidence claims and confirm the main office-buying pattern still holds.
    - Lower confidence if the plan depends on sparse office history, one-off vehicle signals, or one-record contact evidence.
    - If evidence is too thin to support a credible office plan, say so clearly and stop.

## Tool Budget
Design the workflow to stay compact.

Typical path:
- 8 required documentation calls
- 0 to 5 additional documentation calls for optional narrowing, ordering-path, opportunity, or saved-search branches
- 1 vendor-resolution call
- 1 office-resolution call
- 1 office aggregation pass
- 1 office row-retrieval pass
- 1 vendor-fit pass
- 1 contact pass
- 0 to 1 ordering-path follow-on
- 0 to 1 opportunity overlay
- 0 to 1 saved-search creation
- 0 to 1 verification pass

Expected total:
- Typical: 14 to 15 calls
- High end with ordering-path, opportunity, and saved-search branches: 16 to 18 calls

Budget rule:
- Avoid exceeding 18 calls unless an additional call materially changes correctness, completeness, or grounding.

## Output Format
Return the answer in this order:

1. **Vendor and Office Resolution**
   - Briefly explain how the target vendor and target buyer office were resolved.
   - State the planning window used.
   - State whether the office was analyzed as a contracting office, a funding office, or both.
2. **Search Approach**
   - Briefly explain which `Documentation.article_name` calls were used.
   - Briefly explain which GovTribe tools were used.
   - Briefly explain how the office aggregation pass, office row pass, vendor-fit pass, contact pass, and any optional ordering-path or opportunity branch were used.
3. **Office Buying Profile**
   - Use a required markdown table.
   - Recommended columns: `Dimension`, `Finding`, `Evidence`.
   - Cover dominant work categories, leading awardees, value-band pattern, concentration, recurring ordering paths, and any important caveats.
4. **Representative Recent Buys**
   - Use a compact markdown table.
   - Recommended columns: `Award`, `Awardee`, `Date`, `Value`, `Structure`, `Why It Matters`.
5. **Evidence-Backed Contacts**
   - Use a compact markdown table.
   - Recommended columns: `Contact`, `Organization`, `Role / Signal`, `Why Relevant`.
   - If no evidence-backed contacts are available, say so clearly instead of fabricating a contact list.
6. **Ordering Path and Access Assessment**
   - Explain whether the office appears stand-alone, IDV-led, vehicle-led, or mixed.
   - Explain whether the vendor shows `Direct`, `Adjacent`, `Inferred`, or `Weak` access evidence and why.
   - Label inferences clearly.
7. **Office-Specific Pursuit Angles / Light Micro-Offer Menu**
   - Provide 3 to 5 concise pursuit angles or micro-offer themes.
   - Tie each one directly to observed buying evidence, work category, value band, or ordering path.
   - Do not turn this section into a full offer packet, CLIN build, or long outreach draft.
8. **Recommended Next Actions**
   - Provide concise next steps for the vendor's buyer expansion plan.
   - Include saved-search or alert recommendations only when that branch was requested or run.
9. **Overall Confidence**
   - State overall confidence and briefly explain the main supporting evidence and limitations.

## Citation Rules
- Only cite sources retrieved in the current workflow.
- Never fabricate citations, URLs, IDs, or quote spans.
- Use exactly the citation format required by the host application.
- Attach citations to the specific claims they support, not only at the end.

## Grounding Rules
- Base claims only on provided context or GovTribe MCP tool outputs.
- If sources conflict, state the conflict explicitly and attribute each side.
- If the context is insufficient or irrelevant, narrow the answer or state that the goal cannot be fully completed from the available evidence.
- If a statement is an inference rather than a directly supported fact, label it as an inference.
