How It Works

Understanding the Deal Pulse scoring mechanism, categories, and activity tracking

Deal Pulse calculates a 0-100 score that measures the health of your deal based on engagement and activity patterns. This page explains how the scoring works.

The Big Picture

Your Deal Pulse score combines five different categories, each measuring a different aspect of deal health:

text
Overall Score =
  Engagement (50%) +
  Collaboration (25%) +
  Diversity (10%) +
  Organization (10%) +
  Communication (5%)

What this means:

  • Half your score comes from Engagement (meetings and buyer activity)
  • A quarter comes from Collaboration (mutual planning)
  • The remaining quarter is split among Diversity, Organization, and Communication
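The weighting above can be sketched in Python. This is illustrative only; the real Deal Pulse implementation is not public, so the function and dictionary names are assumptions, while the weights come straight from the documented formula.

```python
# Illustrative sketch of the documented weighting; names and structure
# here are assumptions, only the weights come from the documentation.

WEIGHTS = {
    "engagement": 0.50,
    "collaboration": 0.25,
    "diversity": 0.10,
    "organization": 0.10,
    "communication": 0.05,
}

def overall_score(category_scores: dict) -> float:
    """Blend five 0-100 category scores into one 0-100 overall score."""
    return sum(WEIGHTS[name] * category_scores[name] for name in WEIGHTS)

# Example: strong Engagement dominates the result.
scores = {
    "engagement": 80,
    "collaboration": 40,
    "diversity": 50,
    "organization": 50,
    "communication": 20,
}
print(round(overall_score(scores), 2))  # 61.0
```

Note how a strong Engagement score (80) pulls a deal with otherwise mediocre categories up to 61, which lands in the healthy range.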

Category Weights

The five categories aren't equal - some matter more than others:

| Category | Weight | Why It Matters |
|---|---|---|
| Engagement | 50% | Meeting activity and buyer interaction are the strongest signs of deal health |
| Collaboration | 25% | Mutual planning shows both sides are committed to moving forward |
| Diversity | 10% | Multiple stakeholders reduce the risk of the deal falling through |
| Organization | 10% | A structured approach indicates serious evaluation |
| Communication | 5% | Dialogue matters, but less than actual meetings and progress |

Total: 100%

What this tells you:

  • Focus on Engagement first (half your score)
  • Collaboration second (a quarter of your score)
  • Other categories matter but have less impact

Recent Activity Matters More

Deal Pulse values recent activity over old activity. Activity from last week counts more than activity from last month.

Time Windows

text
Last 7 days:    Full credit (100%)
Last 30 days:   Reduced credit (85%)
Last 90 days:   Lower credit (70%)
Older than 90 days: No credit (0%)

Example:

  • Comment from yesterday: counts fully
  • Comment from 3 weeks ago: counts for 85%
  • Comment from 2 months ago: counts for 70%
  • Comment from 4 months ago: doesn't count

Why? Deal health changes. Heavy activity 3 months ago doesn't mean the deal is active today.
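The time windows above map cleanly onto a step function. This sketch is an assumption about shape only; the multipliers themselves are the documented ones.

```python
# Sketch of the documented time-decay multipliers for regular activities
# (illustrative; not the actual Deal Pulse implementation).

def decay_multiplier(age_days: int) -> float:
    """Return the credit multiplier for an activity of a given age."""
    if age_days <= 7:
        return 1.0    # last 7 days: full credit
    if age_days <= 30:
        return 0.85   # last 30 days: reduced credit
    if age_days <= 90:
        return 0.70   # last 90 days: lower credit
    return 0.0        # older than 90 days: no credit

print(decay_multiplier(1))    # 1.0  -- comment from yesterday
print(decay_multiplier(21))   # 0.85 -- comment from 3 weeks ago
print(decay_multiplier(60))   # 0.7  -- comment from 2 months ago
print(decay_multiplier(120))  # 0.0  -- comment from 4 months ago
```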

Meetings Get Special Treatment

Meetings decay slower than other activities because they happen less frequently:

text
Regular activities decay: 100% → 85% → 70%
Meetings decay: 100% → 92.5% → 85%

Why? A monthly meeting cadence is normal. We don't want to over-penalize deals with less frequent (but still healthy) meeting schedules.

Practical impact:

  • Your score won't drop dramatically between monthly meetings
  • Enterprise deals with longer cycles aren't unfairly penalized
  • Quarterly business reviews still contribute to your score

Buyer Activity Counts More

Actions by buyers count 1.5x more than actions by sellers.

Where this applies:

  • Buyer comments vs seller comments
  • Buyer first-time actions (first view, first contact added)

Example:

text
Seller adds 10 comments: 10 points
Buyer adds 10 comments: 15 points

Why? Seller activity is expected - you're always going to be active. Buyer activity shows genuine interest and engagement.
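The multiplier is simple enough to express directly. A minimal sketch, assuming the 1.5x factor applies as a flat per-action weight (the function name is hypothetical):

```python
# Sketch of the documented 1.5x buyer multiplier (illustrative only).
BUYER_MULTIPLIER = 1.5

def comment_points(count: int, is_buyer: bool) -> float:
    """Weight a comment count by who made the comments."""
    return count * (BUYER_MULTIPLIER if is_buyer else 1.0)

print(comment_points(10, is_buyer=False))  # 10.0 -- seller adds 10 comments
print(comment_points(10, is_buyer=True))   # 15.0 -- buyer adds 10 comments
```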


Activity Gets Capped

The system caps activity counts to prevent extreme outliers from distorting scores.

Caps:

  • Buyer logins: capped at 5 per buyer
  • Comments/tasks: capped at 20
  • Contacts: capped at 5
  • Meetings: capped at 3

What this means:

  • Having 3 meetings gives you full meeting credit
  • Having 10 meetings doesn't give you 3x the score
  • Quality over quantity

Why? Three meaningful meetings are better than ten quick check-ins. The system rewards normal, healthy activity patterns rather than excessive activity.
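Capping is a straightforward clamp. A sketch using the documented limits (the dictionary keys are assumptions):

```python
# Sketch of the documented per-activity caps (illustrative key names).

CAPS = {
    "buyer_logins_per_buyer": 5,
    "comments_or_tasks": 20,
    "contacts": 5,
    "meetings": 3,
}

def capped(activity: str, count: int) -> int:
    """Clamp a raw activity count to its documented cap."""
    return min(count, CAPS[activity])

print(capped("meetings", 3))   # 3 -- full meeting credit
print(capped("meetings", 10))  # 3 -- no extra credit beyond the cap
```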


Score States

Your overall score (0-100) maps to one of four states:

| Score Range | State | What It Means |
|---|---|---|
| 50-100 | ON_TRACK | Healthy engagement, deal progressing |
| 25-49 | AT_RISK | Low activity, needs attention |
| 5-24 | OFF_TRACK | Very low activity, likely stalled |
| 0-4 | INACTIVE | Essentially no activity |

These are labels, not calculations - they help you interpret your score quickly.
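Since the states are just labels on score ranges, the mapping is a simple threshold function. Sketch (thresholds taken from the table above):

```python
# Sketch of the documented score-to-state mapping.

def score_state(score: int) -> str:
    if score >= 50:
        return "ON_TRACK"
    if score >= 25:
        return "AT_RISK"
    if score >= 5:
        return "OFF_TRACK"
    return "INACTIVE"

print(score_state(61))  # ON_TRACK
print(score_state(42))  # AT_RISK
print(score_state(10))  # OFF_TRACK
print(score_state(2))   # INACTIVE
```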


How Scores Update

Timeline:

Today:

text
You add comments, schedule meetings, update milestones
Activity is recorded immediately

Tonight (~1 AM):

text
System aggregates all activity
Counts meetings, comments, sessions, etc.

Tonight (~2-3 AM):

text
Deal Pulse calculates new scores
Applies weights and time decay
Determines score state

Tomorrow morning:

text
New score appears in UI
You see the updated number

Important: Today's activity won't affect today's score. Scores update overnight, so changes appear the next morning.


What's NOT Considered

Deal Pulse doesn't see:

External activities:

  • Email exchanges
  • Phone calls
  • Slack or Teams messages
  • In-person meetings (unless logged in platform)
  • CRM activities

Contextual information:

  • Deal size or value
  • Company size or industry
  • CRM stage or forecast category
  • Budget approval status
  • Competitive situation

Content details:

  • What your messages say
  • Tone or sentiment
  • What documents contain
  • Meeting agendas or outcomes

Why not? Deal Pulse focuses on measurable engagement patterns inside the Decision Site. This keeps scoring objective, consistent, and fair across all deals.


Algorithm Version

Current version: vibe-clso4 v1.1.0

What this means:

  • The algorithm has a version number
  • If we improve the scoring system, the version updates
  • Your historical scores show which version calculated them
  • Helps explain score changes over time

Recent changes (v1.1.0):

  • Softer time decay for meetings
  • Better handling of past vs future meetings

Configuration

Deal Pulse configuration is standardized - same for all organizations.

Why?

  • Ensures consistency across all users
  • Allows comparison between deals
  • Based on analysis of what predicts deal success
  • No configuration burden on admins

Future: We may add custom weighting for specific industries or use cases, but for now everyone uses the same algorithm.


How This Helps You

Use Deal Pulse to:

  1. Spot deals going cold - declining score means losing momentum
  2. Identify weak areas - check category breakdown to see what's missing
  3. Prioritize time - focus on deals with good scores but specific weaknesses
  4. Course correct - see which actions will boost your score most

Don't use Deal Pulse for:

  1. Predicting close probability - score measures engagement, not outcome
  2. Replacing judgment - combine with CRM, pipeline value, and your instincts
  3. Gaming the system - fake activity doesn't help you close deals


Category Breakdown

Deal Pulse combines five categories into an overall score. Each category measures a different aspect of deal health. This page explains what each category tracks and how to interpret the scores.

The Five Categories

text
Overall Score =
  Engagement (50%) +
  Collaboration (25%) +
  Diversity (10%) +
  Organization (10%) +
  Communication (5%)

1. Engagement (50% Weight)

What it measures: Meeting activity, buyer logins, and how recently you've been active

Why it's weighted highest: Engagement is the strongest predictor of deal health. Active deals have meetings, buyers participating, and recent activity.

What Goes Into Engagement

Engagement combines four things:

| Component | Importance | What It Tracks |
|---|---|---|
| Meetings | 70% | Past and upcoming meetings |
| Buyer Logins | 15% | How often buyers access Decision Site |
| Recency | 10% | Days since last activity |
| Buyer Milestones | 5% | First buyer view, first contact add |
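The Engagement category is itself a weighted blend of these four components. A sketch under the assumption that each component is a 0-100 subscore (the subscore scale is an assumption, the weights are documented):

```python
# Sketch of the Engagement blend using the documented component weights;
# the 0-100 subscore scale for each component is an assumption.

ENGAGEMENT_WEIGHTS = {
    "meetings": 0.70,
    "buyer_logins": 0.15,
    "recency": 0.10,
    "buyer_milestones": 0.05,
}

def engagement_score(components: dict) -> float:
    return sum(ENGAGEMENT_WEIGHTS[k] * components[k] for k in ENGAGEMENT_WEIGHTS)

components = {
    "meetings": 80,           # solid recent and upcoming meetings
    "buyer_logins": 60,       # some buyer sessions
    "recency": 100,           # activity this week
    "buyer_milestones": 100,  # both one-time bonuses achieved
}
print(round(engagement_score(components), 2))  # 80.0
```

A weak meetings subscore drags Engagement down hard because it carries 70% of the category.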

How Meetings Work

Meetings are the biggest driver of Engagement (70% of the category).

What counts:

  • Past meetings that occurred
  • Future meetings scheduled
  • Either past OR future can drive the score (whichever is higher)

Special rules:

  • Future meetings count for 80% of what past meetings count for
  • More recent meetings count more than older ones
  • Meetings 3+ months ago don't count
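The "either past OR future, whichever is higher" rule combined with the 80% future discount can be sketched as follows (illustrative; the function name and value scale are assumptions):

```python
# Sketch of the documented past/future meeting rule: future meetings count
# at 80% of past-meeting value, and the higher side drives the score.
FUTURE_DISCOUNT = 0.8

def meeting_signal(past_value: float, future_value: float) -> float:
    return max(past_value, FUTURE_DISCOUNT * future_value)

print(meeting_signal(past_value=50, future_value=50))   # past side wins
print(meeting_signal(past_value=10, future_value=100))  # future side wins
```

This is why a well-stocked calendar of upcoming meetings can carry the score even when recent past meetings are thin.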

Example of a strong meeting score:

text
Had 2 meetings last month
Have 1 meeting scheduled next week
Last meeting was 10 days ago

Example of a weak meeting score:

text
Last meeting was 60 days ago
No upcoming meetings scheduled
Haven't met in over 2 months

How Buyer Logins Work

The system tracks how often buyers log in to view the Decision Site.

What counts:

  • Unique browser sessions per buyer
  • Average across all buyers
  • More recent logins count more

Example of strong buyer login activity:

text
Buyers logging in weekly
2-3 sessions per buyer in last week
Regular engagement with content

Example of weak buyer login activity:

text
Buyers haven't logged in for weeks
Only 1 login in last month
No recent engagement

How Recency Works

How recently was there ANY activity?

Scoring:

  • Activity today: maximum points
  • Activity 15 days ago: half points
  • Activity 30+ days ago: zero points
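Those three data points (maximum today, half at 15 days, zero at 30+) are consistent with a linear ramp over 30 days. A sketch under that assumption:

```python
# Sketch of recency scoring as a linear 30-day decay (the linear shape is
# an assumption inferred from the documented data points).

def recency_score(days_since_last_activity: int, max_points: float = 100.0) -> float:
    remaining = max(0.0, 1.0 - days_since_last_activity / 30.0)
    return max_points * remaining

print(recency_score(0))   # 100.0 -- activity today
print(recency_score(15))  # 50.0  -- activity 15 days ago
print(recency_score(30))  # 0.0   -- activity 30+ days ago
```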

What counts as activity:

  • Any action in Decision Site
  • Meeting, comment, milestone update, etc.
  • Buyer or seller activity

Example:

text
Strong recency: Last activity yesterday
Moderate recency: Last activity 2 weeks ago
Weak recency: Last activity over a month ago

Buyer Milestones

First-time buyer actions get bonus points:

  • First buyer view: When a buyer first logs into Decision Site
  • First buyer adds contact: When a buyer invites another stakeholder

These are one-time bonuses - they stick with you permanently once achieved.

Why these matter: Shows buyer taking ownership and expanding the buying committee.

Strong vs Weak Engagement

Strong Engagement (70-100):

text
✓ 2-3 meetings per month
✓ Buyers logging in weekly
✓ Activity within last 7 days
✓ Meetings scheduled for next month
✓ Buyers invited other stakeholders

Weak Engagement (0-39):

text
✗ No meetings in 60+ days
✗ Buyers haven't logged in for weeks
✗ Last activity 30+ days ago
✗ No upcoming meetings scheduled
✗ No buyer-initiated actions

2. Collaboration (25% Weight)

What it measures: Mutual action plan activity - milestones, tasks, and templates

Why it matters: Collaboration shows both sides are committed to moving the deal forward together.

What Goes Into Collaboration

| Component | Importance | What It Tracks |
|---|---|---|
| Milestones | 40% | Adding, updating, completing milestones |
| Action Items | 40% | Adding, completing tasks |
| Templates | 10% | Using mutual plan templates |
| Plan Started | 10% | Whether you've started a mutual plan |

How Milestones Work

All milestone actions count:

  • Adding new milestones
  • Updating milestone progress
  • Completing milestones

What matters:

  • Regular milestone activity (not one-time bursts)
  • Both adding AND completing (shows progress)
  • Recent updates (old milestones decay)

Example of strong milestone activity:

text
5-10 active milestones
Milestones updated weekly
1-2 milestones completed per month
New milestones added as deal progresses

Example of weak milestone activity:

text
No milestones created
Milestones created but never updated
No milestones completed
All activity 2+ months ago

How Action Items Work

Creating and completing tasks shows execution:

What counts:

  • Adding new action items
  • Completing action items
  • (Updates to action items don't count separately)

What matters:

  • Both creating AND completing tasks
  • Regular flow of new tasks
  • Balance between buyer and seller tasks

Example of strong action item activity:

text
10-15 active tasks
New tasks created weekly
Tasks completed regularly
Both buyer and seller completing their tasks

Template Usage

Applying a mutual plan template gives bonus points.

Why? Using templates shows:

  • Process maturity
  • Structured approach
  • Professional sales practice

Note: This is a small bonus, not a major driver of score.

Mutual Plan Started

One-time bonus for starting a mutual plan (adding first milestone or action item).

Why? Starting a mutual plan is a commitment signal. It shows you're moving beyond just talking to actually planning together.

Strong vs Weak Collaboration

Strong Collaboration (70-100):

text
✓ Active mutual plan with 5+ milestones
✓ Milestones being updated weekly
✓ Action items created and completed regularly
✓ Both sides executing on their commitments
✓ Used a template to structure the plan

Weak Collaboration (0-39):

text
✗ No mutual plan exists
✗ Milestones created but abandoned
✗ No action items or all incomplete
✗ Only seller activity, no buyer participation
✗ No structure or process

3. Diversity (10% Weight)

What it measures: Stakeholder breadth - how many people and organizations are involved

Why it matters: Multiple stakeholders across departments reduce single-point-of-failure risk.

What Goes Into Diversity

| Component | Importance | What It Tracks |
|---|---|---|
| Buyer Count | 30% | Contacts with buyer roles |
| Email Domains | 25% | Different companies represented |
| Total Contacts | 20% | Overall stakeholder count |
| Departments | 12.5% | Different departments involved |
| Job Titles | 12.5% | Different roles represented |

What These Mean

Buyer count:

  • How many buyer-role contacts you have
  • Decision makers, influencers, champions, etc.
  • More buyers = more support

Email domains:

  • Different companies involved (@company1.com, @company2.com)
  • Shows broader organizational buy-in
  • Multiple divisions or parent/subsidiary

Total contacts:

  • Everyone involved (buyers and sellers)
  • Shows breadth of engagement
  • More contacts = more touch points

Departments:

  • IT, Finance, Operations, Sales, etc.
  • Cross-functional involvement
  • Different departments = different perspectives represented

Job titles:

  • VP, Director, Manager, etc.
  • Different levels and functions
  • Title diversity = comprehensive stakeholder map

Strong vs Weak Diversity

Strong Diversity (70-100):

text
✓ 5+ contacts from buyer side
✓ 2-3 different companies/divisions involved
✓ IT, Finance, and Operations departments
✓ Mix of decision maker, technical evaluator, executive sponsor
✓ Multiple levels (VP, Director, Manager)

Weak Diversity (0-39):

text
✗ 1-2 contacts total
✗ Single company/division
✗ Only one department
✗ All same role type
✗ Single-threaded through one champion

Why this matters: If your one champion leaves, changes roles, or loses interest, the deal dies. Multiple stakeholders provide resilience.


4. Organization (10% Weight)

What it measures: Task organization, content sharing, and balanced participation

Why it matters: Structured deals with clear ownership move forward more reliably.

What Goes Into Organization

| Component | Importance | What It Tracks |
|---|---|---|
| Task Organization | 60% | Action items with assignee or due date |
| Artifacts | 20% | Whether documents have been shared |
| Balanced Completion | 20% | Both buyer and seller completing tasks |

Task Organization

System checks: do your action items have assignees or due dates?

Scoring:

  • Percentage of action items that are properly organized
  • Either assignee OR due date counts
  • Both is even better

Example:

text
Strong: 90% of tasks have assignee or due date
Weak: 30% of tasks have assignee or due date

Why it matters: Unassigned tasks with no due dates tend not to get done.
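Because the score is a direct percentage, it's easy to sketch. The task representation below (dicts with optional keys) is an assumption for illustration:

```python
# Sketch: Task Organization as the share of action items that have an
# assignee or a due date (task representation is assumed).

def organization_pct(tasks: list) -> float:
    """Each task is a dict with optional 'assignee' and 'due_date' keys."""
    if not tasks:
        return 0.0
    organized = sum(1 for t in tasks if t.get("assignee") or t.get("due_date"))
    return 100.0 * organized / len(tasks)

tasks = [{"assignee": "dana"}, {"due_date": "2025-06-01"}, {}, {}]
print(organization_pct(tasks))  # 50.0 -- 2 of 4 tasks are organized
```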

Artifacts

One-time bonus for uploading content:

What counts:

  • Proposals
  • Case studies
  • Technical documentation
  • ROI calculators
  • Demo recordings
  • Any shared content

Why it matters: Sharing content shows preparation and value delivery.

Balanced Completion

Are both buyers AND sellers completing their tasks?

Scoring:

  • Bonus points if both sides have completions
  • One-sided completion (only seller or only buyer) gets zero bonus
  • Encourages true collaboration

Example:

text
Strong: Buyers completing 40% of tasks, sellers 60%
Weak: Sellers completing 100% of tasks, buyers 0%

Why it matters: Mutual plan only works if both sides execute.
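The all-or-nothing nature of this bonus can be sketched directly. The bonus size is an assumption for illustration; only the "both sides must complete" rule comes from the documentation:

```python
# Sketch of the balanced-completion bonus: awarded only when both buyers
# and sellers have completed at least one task (bonus size is assumed).

def balance_bonus(buyer_completions: int, seller_completions: int,
                  bonus: float = 20.0) -> float:
    if buyer_completions > 0 and seller_completions > 0:
        return bonus
    return 0.0  # one-sided completion earns no bonus

print(balance_bonus(4, 6))   # 20.0 -- both sides executing
print(balance_bonus(0, 10))  # 0.0  -- seller-only, no bonus
```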

Strong vs Weak Organization

Strong Organization (70-100):

text
✓ 85%+ of action items have assignee or due date
✓ Shared multiple artifacts (proposal, case studies, docs)
✓ Buyers and sellers both completing their assigned tasks
✓ Clear ownership and accountability

Weak Organization (0-39):

text
✗ Most tasks unassigned with no due dates
✗ No content shared
✗ Only sellers completing tasks (one-sided)
✗ No structure or accountability

5. Communication (5% Weight)

What it measures: Comments and dialogue between buyers and sellers

Why it's weighted lowest: Communication is important but less critical than meetings, collaboration, and progress.

What Goes Into Communication

| Component | Importance | What It Tracks |
|---|---|---|
| Buyer Comments | 40% | Comments from buyers (1.5x weight) |
| Seller Comments | 40% | Comments from sellers |
| Dialogue Balance | 20% | Both sides participating equally |

Buyer vs Seller Comments

Buyer comments worth more:

  • Buyer comment gets 1.5x weight
  • Seller comment gets 1.0x weight
  • Encourages getting buyers to engage

What counts:

  • Comments on milestones
  • Comments on artifacts
  • Comments on action items
  • General discussion

Dialogue Balance

Bonus for balanced participation:

  • Best: Both buyer and seller commenting roughly equally
  • Okay: One side heavier than the other
  • Weak: Only one side commenting (monologue)

Example:

text
Strong balance: Buyer 10 comments, Seller 12 comments
Weak balance: Buyer 0 comments, Seller 20 comments

Strong vs Weak Communication

Strong Communication (70-100):

text
✓ Buyers commenting on milestones and artifacts
✓ Sellers responding to questions
✓ Regular back-and-forth dialogue
✓ Both sides roughly equal in participation

Weak Communication (0-39):

text
✗ Only seller commenting (monologue)
✗ Buyers never commenting
✗ No dialogue or back-and-forth
✗ Long gaps between comments

Why it matters (but not as much): Comments show engagement, but actual meetings and progress matter more. You can have great comments but no meetings, which isn't a healthy deal.


Category Score Ranges

What Different Scores Mean

| Category | Excellent (80+) | Good (60-79) | Fair (40-59) | Weak (0-39) |
|---|---|---|---|---|
| Engagement | 3+ meetings/month, daily logins | 2 meetings/month, weekly logins | 1 meeting/month, occasional logins | No meetings, rare logins |
| Collaboration | Active plan, regular completions | Plan exists, some activity | Plan started, minimal activity | No mutual plan |
| Diversity | 5+ contacts, multiple departments | 3-4 contacts, 2 departments | 2-3 contacts, 1-2 departments | 1-2 contacts, single dept |
| Organization | All tasks assigned with dates | Most tasks organized | Some tasks organized | Few/no tasks organized |
| Communication | Regular comments from both sides | Occasional comments | Rare comments | No comments |

How to Use Category Breakdown

Don't just look at overall score - check which categories are weak.

Example 1: Low Engagement

text
Overall: 42 (AT_RISK)

Categories:
- Engagement: 25 (meetings stopped)
- Collaboration: 55 (mutual plan exists)
- Diversity: 60 (good stakeholders)
- Organization: 45 (average)
- Communication: 35 (minimal)

Diagnosis: Meetings stopped, need to re-engage
Action: Schedule next meeting

Example 2: Low Collaboration

text
Overall: 48 (AT_RISK)

Categories:
- Engagement: 70 (good meetings)
- Collaboration: 15 (no mutual plan)
- Diversity: 55 (decent stakeholders)
- Organization: 50 (average)
- Communication: 40 (some comments)

Diagnosis: Meetings happening but no joint commitment
Action: Create mutual action plan

Example 3: Narrow Stakeholders

text
Overall: 51 (ON_TRACK but barely)

Categories:
- Engagement: 65 (regular meetings)
- Collaboration: 60 (active plan)
- Diversity: 20 (single-threaded)
- Organization: 55 (organized)
- Communication: 50 (decent)

Diagnosis: Single point of failure (one champion)
Action: Expand to other stakeholders


Activity Tracking

Deal Pulse scores are based on activities that happen in your Decision Site. This page explains what activities are tracked, how they're used in scoring, and what's NOT tracked.

How Activity Tracking Works

Simple flow:

text
You take an action → System records it → Overnight, data is analyzed → Tomorrow, score updates

Timeline:

  1. Today: You schedule a meeting, add a comment, update a milestone
  2. Tonight (~1 AM): System counts all activity
  3. Tonight (~2-3 AM): Deal Pulse calculates new scores
  4. Tomorrow morning: Updated score appears

Important: Today's activity won't affect today's score - it appears tomorrow.


What Gets Tracked

Meetings

What counts:

  • Past meetings that occurred
  • Future meetings scheduled on calendar
  • Meetings linked to your Decision Site

How it's used:

  • Primary driver of Engagement (70% of Engagement category)
  • Recent meetings count more than older meetings
  • Future meetings count for 80% of past meetings

What matters:

  • Meeting frequency (2-3 per month is ideal)
  • Recent meetings (last 7-30 days)
  • Upcoming meetings (next 7-30 days)

Doesn't count:

  • Meetings not linked to Decision Site
  • External calendar events not imported
  • Phone calls or video calls not logged as meetings

Buyer Logins

What counts:

  • Buyer viewing Decision Site
  • Each browser session per buyer
  • Averaged across all buyers

How it's used:

  • Part of Engagement category (15% of Engagement)
  • Shows buyer interest and participation
  • Recent logins count more than older ones

What matters:

  • How often buyers log in (weekly is ideal)
  • Number of different buyers logging in
  • Recency of logins

Note: A "session" means a buyer opens Decision Site in their browser. Multiple page views in one visit = one session.


Milestones

What counts:

  • Adding a new milestone
  • Updating milestone progress or description
  • Completing a milestone

How it's used:

  • Major part of Collaboration category (40% of Collaboration)
  • All milestone actions count
  • Recent activity counts more

What matters:

  • Having 5-10 active milestones
  • Regular updates (weekly)
  • Completing milestones as you progress
  • Both adding AND completing (not just creating)

Doesn't count:

  • Deleting milestones
  • Just viewing milestones (no action taken)

Action Items (Tasks)

What counts:

  • Adding a new action item
  • Completing an action item

How it's used:

  • Major part of Collaboration category (40% of Collaboration)
  • Also affects Organization category (task assignment, balance)
  • Recent activity counts more

What matters:

  • Creating new tasks regularly
  • Completing tasks (both buyer and seller)
  • Having assignees and due dates set
  • Balance between buyer and seller completions

Doesn't count:

  • Updating action items (only adding and completing count)
  • Deleting action items

Templates

What counts:

  • Applying a mutual plan template to your Decision Site

How it's used:

  • Small part of Collaboration category (10% of Collaboration)
  • Shows process maturity

What matters:

  • Using at least one template
  • Customizing template to your deal

Note: This is a small bonus for using structured approaches.


Contacts

What counts:

  • Total number of contacts
  • Number of buyer-role contacts
  • Number of unique email domains
  • Number of unique departments
  • Number of unique job titles

How it's used:

  • Entire Diversity category (all components)
  • Shows stakeholder breadth

What matters:

  • Having 5+ total contacts
  • Having 2-3 buyer-role contacts
  • Multiple departments represented
  • Cross-functional involvement

Special bonus:

  • First time a buyer adds a contact (one-time Engagement bonus)

Comments

What counts:

  • Adding a comment on any item (milestone, artifact, action item)
  • Separated by buyer vs seller

How it's used:

  • Entire Communication category
  • Buyer comments worth 1.5x seller comments
  • Balance between buyer and seller comments matters

What matters:

  • Both buyers and sellers commenting
  • Regular back-and-forth dialogue
  • Recent comments count more

Doesn't count:

  • Editing or deleting comments
  • Just viewing comments

Artifacts (Documents)

What counts:

  • Uploading an artifact (first time only for scoring bonus)
  • Viewing artifacts (tracked but minimal impact)

How it's used:

  • One-time bonus in Organization category (20% of Organization)
  • Shows content sharing and preparation

What matters:

  • Sharing at least one artifact
  • Type of content doesn't matter (all artifacts count equally)

Examples:

  • Proposals
  • Case studies
  • Technical documentation
  • Demo recordings
  • ROI calculators

Task Organization

What counts:

  • Percentage of action items that have either:
    • An assignee assigned, OR
    • A due date set

How it's used:

  • Major part of Organization category (60% of Organization)
  • Direct percentage becomes the score

What matters:

  • Assigning tasks to specific people
  • Setting due dates
  • Having 80%+ of tasks organized

Example:

text
10 total action items:
- 8 have assignee or due date → 80% score
- 2 have neither → hurts score

First-Time Actions

Special one-time bonuses for:

First buyer view:

  • When a buyer first logs into Decision Site
  • Bonus: added to Engagement category
  • Only counts once, ever

First buyer adds contact:

  • When a buyer first invites another stakeholder
  • Bonus: added to Engagement category
  • Only counts once, ever

Mutual plan started:

  • First time any mutual plan activity occurs
  • Bonus: added to Collaboration category
  • Only counts once, ever

First artifact upload:

  • First time any artifact is uploaded
  • Bonus: added to Organization category
  • Only counts once, ever

These bonuses stick with you permanently - once achieved, they don't decay.


Time Windows

All activity is counted in three rolling time windows:

Recent (7 days):

  • Last/next 7 days
  • Full credit (100%)
  • Most important window

Medium (30 days):

  • Last/next 30 days
  • Reduced credit (85%)
  • Moderately important

Distant (90 days):

  • Last/next 90 days
  • Lower credit (70%)
  • Least important

Older than 90 days:

  • Not counted (0%)
  • Completely excluded from scoring

Example:

text
Comment from 5 days ago: in "recent" window (100% credit)
Comment from 20 days ago: in "medium" window (85% credit)
Comment from 60 days ago: in "distant" window (70% credit)
Comment from 120 days ago: not counted (0% credit)

What's NOT Tracked

External activities:

  • Email exchanges
  • Phone calls not logged as meetings
  • Slack or Teams messages
  • Text messages
  • In-person conversations
  • CRM activities (logging notes, updating stages)

Why not? Deal Pulse only tracks activity inside the Decision Site to ensure measurable, consistent scoring.

Platform activity:

  • Passive viewing (looking at pages without taking action)
  • Navigation clicks
  • Time spent on pages
  • Downloads (unless it's tracked as artifact view)

Content details:

  • What your comments say
  • Sentiment or tone
  • What documents contain
  • File sizes or types
  • Meeting agendas or outcomes

Why not? The system counts activities, not content, which keeps scoring objective.

Contextual information:

  • Deal size or value
  • CRM stage
  • Forecast category
  • Close date
  • Win probability
  • Competitive situation

Why not? Deal Pulse measures engagement patterns, not business context.


Activity vs Score Impact

High-impact activities (do these for biggest score boost):

  1. Schedule meetings → drives Engagement (50% of overall)
  2. Create mutual plan → drives Collaboration (25% of overall)
  3. Add milestones and tasks → drives Collaboration
  4. Get buyers to log in → drives Engagement (1.5x multiplier)
  5. Add stakeholders → drives Diversity (10% of overall)

Medium-impact activities:

  1. Complete milestones/tasks → drives Collaboration
  2. Assign tasks and set due dates → drives Organization (10% of overall)
  3. Get buyers to comment → drives Communication (5% of overall, 1.5x multiplier)

Low-impact activities (nice to have but won't move score much):

  1. Seller comments → Communication (5% of overall, normal weight)
  2. Upload artifacts → Organization (one-time bonus)
  3. Apply templates → Collaboration (small bonus)

Common Misunderstandings

"We had 5 phone calls but score didn't increase"

Issue: Phone calls aren't logged as meetings in Decision Site.

Solution: Log external meetings in the platform or schedule them through the platform's calendar integration.

"Buyer is very engaged via email but score is low"

Issue: External email activity isn't tracked.

Solution: Get buyer to engage in platform - log in, comment, complete tasks.

"I updated 10 action items but Collaboration didn't change"

Issue: Updates to action items don't count separately (only add/complete count).

Solution: Complete action items, not just update them. Add new action items.

"Score dropped even though we're active"

Issue: Old activity is decaying faster than new activity is being added.

Solution: Maintain consistent activity. One burst of activity followed by silence will result in declining scores.

"Buyer viewed Decision Site 10 times but login count shows 3"

Issue: Multiple views in one browser session = one session.

Solution: This is normal. System counts sessions, not individual page views.


Best Practices

For accurate tracking:

  1. Log all meetings in platform - don't rely on external calendars unless integrated
  2. Invite buyers to Decision Site - external stakeholders need accounts to track their activity
  3. Use the platform for collaboration - comments, tasks, milestones all count
  4. Set correct roles - buyer vs seller roles affect scoring weight
  5. Be consistent - regular activity beats bursts

For improving scores:

  1. Focus on high-impact activities first (meetings, mutual plan, buyer engagement)
  2. Get buyers active in platform - their actions count 1.5x
  3. Maintain weekly cadence - prevents decay
  4. Multi-thread early - add stakeholders to boost Diversity
  5. Organize tasks - assign and set dates for Organization boost

Next Steps


Questions about specific activities? See FAQ or Common Issues