PRODUCT TEARDOWN

Glints TapLoker (Product Teardown)

A product teardown on user activation and post-application experience.

Client Glints
Framework Double Diamond
Status Completed

User Activation & Post-Application Experience

A product teardown by Juan Tombeng | March 2026


Why Two Problems?

Most teardowns pick one issue and go deep on it. This one covers two, and that is a deliberate choice, not a scope problem.

The job seeker journey on Glints TapLoker has three stages: getting set up, discovering and applying, then waiting for a response. The middle stage is where Glints genuinely excels. The one-tap apply, the standardized job feed, the mobile-first design. All of these work, and they work well. But the first and last stages have friction that quietly undermines the experience the product is trying to build.

Onboarding is where first impressions are formed and where the data that powers the whole platform gets collected. Post-application is where trust is won or lost. If users do not get through onboarding cleanly, they never reach the product’s core value. And if they do reach it but get stuck in a communication dead zone after applying, they disengage without filing a complaint, which is the worst kind of churn to catch. Fixing the middle of the funnel without addressing both ends means the platform is still leaking from two directions.

This teardown covers both.

Product Overview

Glints is a Southeast Asian career platform targeting early-to-mid career professionals, particularly Gen Z and younger Millennials who are entering or pivoting within the job market. The Indonesian product, Glints TapLoker, is built around one core hypothesis: job searching should feel as effortless as scrolling.

That single idea explains most of the product’s consequential decisions. The mobile-first experience, the swipeable job feed, the one-tap application: all of it traces back to that hypothesis. And for the core loop, it works. The product feels more like a content app than a job board, which is exactly what this demographic responds to.

Purpose: To realise human potential for all people and organizations.

Vision: To impact 100 million careers and 1 million organizations, and to become the number one talent platform in Greater Southeast Asia.

Product Strengths

The mobile-first strategy is a real strategic bet. Most products end up on mobile because they have to. Glints went mobile because their users’ decision-making journeys happen on phones. They discover jobs while commuting, apply during lunch, follow up between meetings. A desktop-first platform creates friction at every one of those touchpoints. Glints accepted weaker SEO discoverability in exchange for a better experience on the channel that actually matters to their target segment. That is a mature, intentional tradeoff.

One-tap application solves the right problem. The exhausting part of job hunting is not reading job descriptions. It is filling out the same information such as name, education, work history, expected salary, across different platforms. Glints front-loads that friction at profile creation and then collapses the actual apply action to a single tap. The user’s job is not to fill out a form. Their job is to get in front of a hiring manager. Every second spent on form-filling is time not spent doing the actual thing. The one-tap apply closes that gap.

Standardized employer templates protect the candidate experience. Requiring employers to post in a structured format feels like a constraint from the employer’s side. From the candidate’s side, it removes the cognitive load of parsing wildly inconsistent job descriptions: no consistent format, unpredictable detail levels, salary ranges sometimes present and sometimes absent. Glints removes that variability. In a two-sided marketplace, this is the correct call: optimize for the side with the most alternatives, which is the candidates.

Competitive Landscape

| Competitor | Type | Core Strength | Where Glints Wins | Where Glints Loses |
| --- | --- | --- | --- | --- |
| JobStreet | Direct | Brand trust, listing volume, employer relationships | Speed to apply, mobile UX, design quality | Legacy employers who default to JobStreet out of habit |
| Kalibrr | Direct | Skill assessments, structured screening | Cleaner candidate UX, faster interaction loop | Depth of skills verification |
| Tech in Asia Jobs | Direct | Tech-sector targeting, editorial content | Broader industry coverage, consumer-grade app feel | Niche tech community depth |
| LinkedIn | Indirect | Professional network, social proof, community | Lower friction for casual discovery | Networking depth, content ecosystem, global reach |

Glints’ defensible position is speed of application combined with mobile UX quality. The problem is that this moat is narrow. Any of the above could close the gap with a focused engineering sprint. Long-term defensibility probably depends on building a data and community layer that is harder to replicate, and that is exactly why the two friction points in this teardown matter beyond their surface-level UX impact. A broken onboarding reduces data quality across the entire matching algorithm. A broken post-application experience reduces the behavioral signal Glints needs to build a recommendation engine worth defending.

Business Model

Glints monetizes primarily on the B2B side. Job seekers use the platform for free while employers pay for access to the candidate pool through premium listings, proactive talent search tools, and likely a performance-based hiring fee at enterprise tiers.

This is the correct structure for the market. Charging early-career candidates (Glints’ primary segment) would immediately damage acquisition. The employer-side value only exists because the candidate pool is large and active. Keeping the candidate side free protects the top of the funnel.

The risk that comes with heavy B2B dependence is that product decisions drift toward employer satisfaction at the cost of candidate experience. The “Chat Initiated” problem covered later in this teardown is a direct symptom of that pressure. The platform surfaces employer-initiated contact as an active status even when the employer has gone silent, which protects the employer’s perceived responsiveness at the cost of the candidate’s trust. That is the wrong tradeoff, and it compounds over time.

User Personas

Jason, Recent Postgraduate, 24

Jason finished his undergraduate degree, worked for two years in a full-time role, then went back to complete a master’s degree. He is now re-entering the job market with roughly two years of actual work experience, though the timeline is non-linear. His goal is to apply efficiently across multiple roles and compare options without filling out the same forms repeatedly.

His pain point hits at onboarding. The “Date of Joining Workforce” field asks when he first entered the workforce, then calculates his total experience from that date to today. It does not account for the two years he spent in postgraduate study. The result is an overstated experience figure, and Jason knows it. He is being asked to submit data he knows is inaccurate, which puts him in an impossible position before he has applied to a single job.

Job to be Done: Help me accurately represent my non-linear career history in your standardized system so I can qualify for the right roles without submitting misleading data.

Clarissa, Entry Level Digital Marketer, 22

Clarissa is actively job hunting and has applied to several roles through Glints. She checks the app regularly and depends on notifications to stay on top of her applications. She needs a mobile-first platform where she can track application status and communicate with hiring managers without switching to email.

Her pain point is post-application. She has multiple conversations sitting in “Chat Initiated” status with no follow-up from the employer. There is no way to distinguish the live conversations from the dead ones. She keeps checking back anyway because the status implies someone is engaged. Gradually, her sessions get shorter. Eventually, she opens the app less often, not because the platform failed her dramatically, but because it stopped being worth checking.

Job to be Done: Tell me which opportunities are still live so I can focus my energy where it actually matters.

The Growth and Retention Loop

Glints’ product loop depends on two things working in sequence: candidates completing strong profiles, and candidates receiving enough positive signal post-application to keep returning. These are not independent. A weak profile leads to mismatched applications, which leads to fewer employer responses, which leads to the exact “Chat Initiated” dead zone that drives disengagement.

The two friction points in this teardown sit at the beginning and end of that loop. The “Date of Joining Workforce” problem undermines profile quality at the entry point. The “Chat Initiated” problem undermines engagement quality at the exit point. Together, they represent a loop that is quietly leaking at both ends while the middle looks healthy on surface metrics.

Problem 1: The Onboarding Friction | “Date of Joining Workforce”

During onboarding, users complete three sections: My Profile, Resume, and Job Preferences. Inside My Profile, one of the required fields is “Date of Joining Workforce.” When a user fills this in, the system calculates total years of experience by counting forward from that date to today.

The assumption baked into this logic is continuous employment. For Jason, who took a two-year break for a master’s degree, the calculation overstates his experience by exactly the length of his education gap. He has no way to correct this within the current flow. His options are to submit an inaccurate number, guess at what date would produce the right figure, or abandon the profile section entirely.

Why This Happened

Calculating experience from a single timestamp is efficient for backend database queries. A single date field is clean, lightweight, and easy to index for the matching algorithm. This was a sensible engineering decision. The problem is that it was never revisited from the candidate’s perspective. The system treats career history as a straight chronological line, but a growing segment of candidates, those with graduate education, career pivots, or intentional breaks, does not follow straight lines.

This is a system-centric design decision that created a front-end accuracy problem. The product optimized for query speed and got inaccurate user data as the tradeoff.

The Impact

Inaccurate experience data does not just affect the individual candidate. It degrades the matching algorithm’s output for every employer who receives that candidate’s profile. At scale, this kind of systematic data quality issue compounds quietly and becomes very difficult to attribute to its root cause.

Problem 2: The Post-Application Dead Zone | “Chat Initiated”

After a candidate applies, Glints surfaces a status called “Chat Initiated” when an employer opens contact. In practice, this status often persists indefinitely, including when the employer stopped responding days or weeks ago.

For Clarissa, this creates a specific emotional trap. The status implies active engagement, but the reality is silence. She has no resolution path, no way to tell the dead conversations from the live ones, and no clear signal to move on. Her inbox accumulates more of these conversations over time, and every session has a higher chance of ending in nothing.

Why This is a Product Problem, Not Just a UX Annoyance

The “Chat Initiated” status makes an implicit promise: someone wants to talk to you. When the platform cannot keep that promise because it has no mechanism for resolving employer silence, it erodes candidate trust at the exact moment that trust is most important to build.

Three things happen simultaneously as these conversations accumulate:

Platform trust degrades. The status said one thing and the experience delivered another.

Inbox signal-to-noise deteriorates. Dead leads stack up alongside real opportunities, making it harder to distinguish what actually deserves attention.

Passive churn accelerates. Clarissa does not leave with a complaint. She just opens the app a little less each week. This is the hardest churn to catch because it looks like natural disengagement rather than a product failure.

The job-search context makes this especially tricky to measure. Users naturally reduce frequency when they land a job, which looks like churn but is not. The real signal, users who are still actively searching but opening the app less, gets buried under normal post-hire disengagement. The Status Expiry feature addresses this not just as a UX fix but as a data-cleaning exercise that makes the real retention signal visible.

Proposed Solutions

Solution 1: Career Break Toggle

The fix: Decouple the experience calculation from the single “Date of Joining Workforce” timestamp.

Introduce an optional “Career Break / Further Education” toggle beneath the start date input. When activated, it reveals a field where the user can manually enter their total active working years, overriding the automatic chronological calculation. The backend still receives a clean number for matching purposes; the number is just accurate now.

The UI change is minimal. The engineering change is a conditional override in the experience calculation logic, not a new data model. The payoff is that candidates with non-linear career paths can represent themselves accurately, which benefits both sides of the marketplace.
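As a sketch of that conditional override, assuming a hypothetical `years_of_experience` helper and illustrative dates (Glints’ actual schema and field names are unknown):

```python
from datetime import date
from typing import Optional

def years_of_experience(joined: date, today: date,
                        manual_active_years: Optional[float] = None) -> float:
    """Hypothetical sketch: the existing logic counts forward from a single
    'Date of Joining Workforce' timestamp. The Career Break Toggle adds a
    conditional override: when the user self-reports their active working
    years, that value replaces the chronological count."""
    if manual_active_years is not None:
        # Toggle active: trust the user's self-reported figure.
        return manual_active_years
    # Default path: naive chronological calculation.
    return (today - joined).days / 365.25

# Jason-like case (illustrative dates): joined mid-2018, spent two of the
# intervening years in postgraduate study.
naive = years_of_experience(date(2018, 7, 1), date(2026, 3, 1))  # ~7.7 years
corrected = years_of_experience(date(2018, 7, 1), date(2026, 3, 1),
                                manual_active_years=2.0)         # 2.0 years
```

Because the override lives in the calculation layer, the stored date field and everything downstream in the matching pipeline stay untouched, which is what keeps the engineering cost low.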

What was considered and rejected: Asking users to enter each job’s start and end date individually would produce more granular data but add significant friction to an onboarding flow that the product has deliberately kept lightweight. The toggle approach preserves the simplicity of the existing design while fixing the accuracy problem.

Solution 2: Application Status Expiry

The fix: A time-based mechanism that automatically transitions stale “Chat Initiated” conversations to a resolved state when no employer activity occurs within a defined window.

Day 0:    Candidate applies > Status: "Application Sent"
Day 1:    Employer opens chat > Status: "Chat Initiated"
Day 14:   No employer reply > Auto-transitions to "Inactive"
           Push notification: "No reply received (this application has been moved to Inactive)"
           Candidate can re-activate or archive manually
Day 30:   No action taken > Status: "Closed (No Response)"
           Fully archived, cleared from active inbox
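A minimal sketch of that state machine as a scheduled job comparing timestamps, assuming hypothetical status strings, an `Application` record, and windows taken from the timeline above (none of this reflects Glints’ actual internals):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

INACTIVE_AFTER = timedelta(days=14)  # no employer reply -> "Inactive"
CLOSED_AFTER = timedelta(days=16)    # Inactive -> Closed on day 30 of the timeline

@dataclass
class Application:
    status: str                       # e.g. "Chat Initiated", "Inactive"
    last_employer_activity: datetime
    last_candidate_action: datetime

def expire_stale_statuses(apps: list[Application], now: datetime) -> None:
    """Scheduled job: compare timestamps and update the state column."""
    for app in apps:
        if (app.status == "Chat Initiated"
                and now - app.last_employer_activity >= INACTIVE_AFTER):
            app.status = "Inactive"
            app.last_candidate_action = now  # starts the archive countdown
            # push notification to candidate would fire here
        elif (app.status == "Inactive"
                and now - app.last_candidate_action >= CLOSED_AFTER):
            app.status = "Closed (No Response)"  # cleared from active inbox
```

One pass of this job per day is enough; the comparisons are pure timestamp deltas, which is why the tradeoff analysis later calls this a solved engineering pattern rather than a new system.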

Notification language matters. The copy needs to feel empowering, not like a rejection letter. “No response received, clear this from your list?” works better than “This employer did not reply.” The former gives the candidate agency. The latter assigns blame.

Two additional mechanics make this feature stronger. First, an optional employer nudge: “3 candidates are still waiting on a response, reply or close to keep your response rate healthy.” This improves employer accountability without punitive enforcement and creates a positive feedback loop on their side. Second, a “Similar roles you might like” prompt when a conversation is archived, turning a potentially deflating moment into a re-engagement touchpoint.

Tradeoffs accepted:

| Tradeoff | Reasoning |
| --- | --- |
| Some employers may miss a candidate they planned to follow up with | A 14-day window is generous. Employers with genuine intent have time. The feature also sends a nudge before archiving. |
| Adds complexity to the application state machine | The backend logic is a scheduled job comparing timestamps and updating a state column. This is a solved engineering pattern, not a new system. |
| Risk of false closure if a real conversation is auto-archived | Mitigated by the candidate’s ability to re-activate and the employer nudge that fires before the transition. |

RICE Prioritization

| Feature | Reach | Impact | Confidence | Effort | RICE Score |
| --- | --- | --- | --- | --- | --- |
| Application Status Expiry | 8 | 3 | 0.8 | 2 | 9.6 |
| Career Break Toggle | 6 | 2 | 0.9 | 1 | 10.8 |
| In-App Employer Response Timer (public) | 4 | 3 | 0.6 | 4 | 1.8 |

Reach: estimated share of MAU affected, scored out of 10. Impact: 1 to 3. Confidence: 0.0 to 1.0. Effort: person-weeks.
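The scores follow the standard RICE formula, (Reach x Impact x Confidence) / Effort; a quick check of the table’s arithmetic:

```python
def rice(reach: float, impact: float, confidence: float, effort: float) -> float:
    """Standard RICE score: (Reach x Impact x Confidence) / Effort."""
    return reach * impact * confidence / effort

# Values from the table above.
assert round(rice(8, 3, 0.8, 2), 1) == 9.6    # Application Status Expiry
assert round(rice(6, 2, 0.9, 1), 1) == 10.8   # Career Break Toggle
assert round(rice(4, 3, 0.6, 4), 1) == 1.8    # In-App Employer Response Timer
```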

Career Break Toggle (10.8): Reach is 6 because non-linear career paths are common but not universal. Impact is 2 because this is an activation problem. It improves data quality and profile completion but does not directly affect how often users return. Effort is 1 week because it is a conditional input field and a calculation override. The math works in its favor.

Application Status Expiry (9.6): Reach is 8 because virtually every active user who applies will eventually hit a “Chat Initiated” conversation that goes nowhere. Impact is 3 because this directly addresses platform trust, which is the underlying driver of passive churn. Effort is 2 weeks because the backend mechanism, a scheduled job that updates application state based on timestamp delta, is a well-understood engineering pattern.

The Career Break Toggle scores higher on RICE but addresses an activation problem. The Status Expiry addresses a retention problem. Given that Glints’ long-term value depends on a dense and returning candidate pool, retention arguably deserves priority. But this is a judgment call that ultimately depends on where the funnel is leaking most, which only internal data can settle. The right answer is to run both, as they do not compete for the same engineering resources.

Success Metrics

Career Break Toggle

Primary: Drop-off and bounce rate reduction specifically on the work experience input screen. If fewer users abandon the step or submit visibly incorrect data, the feature is working.

Secondary: Increase in overall profile completion rate to three-star status, which is the threshold that activates the one-tap apply feature. Profile completion is the direct upstream dependency of the product’s core value proposition.

Application Status Expiry

Primary: Reduction in user-reported confusion about application status, tracked via in-app feedback, support tickets tagged “application status,” and app store review sentiment analysis. Target: 20% reduction in status-related complaints within 60 days of launch.

Secondary: Improvement in DAU/MAU ratio. If the inbox is cleaner and each session surfaces more signal, users return more often because the app is actually worth checking. Target: 5% improvement within 90 days.

Guardrail: Account deactivation rate. If users feel the platform is prematurely closing their opportunities, the 14-day window may be too aggressive or the notification copy may be wrong. This metric should not move.

Leading indicator (weekly): Ratio of “Chat Initiated” conversations that transition to an active reply versus expiry. If more than 60% are expiring without any employer reply, the problem is validated at scale and the feature is catching genuine dead leads.

Key Takeaways

Two-sided marketplace decisions are never neutral. Every product choice tilts toward one side or the other. The “Chat Initiated” problem is what happens when the employer side’s silence is allowed to define the candidate’s experience. The platform was optimized around the employer’s workflow, and the candidate paid for it in eroded trust.

System-centric design creates real user costs. The “Date of Joining Workforce” field was a reasonable backend engineering decision that became a front-end accuracy problem. Calculating experience from a single timestamp is efficient. Surfacing that number as the user’s actual experience, with no correction mechanism, is where the product decision went wrong.

Silent churn is the hardest metric to catch. Users naturally disengage when they land a job, which inflates apparent churn in job-search apps. The real signal is users who are still searching but opening the app less. The Status Expiry feature is partly a UX fix and partly a data-quality improvement that makes the real retention signal readable by removing the noise of dead conversations.

Engineering background is a product asset when applied correctly. Recognizing that the experience calculation problem was rooted in backend query design, and that fixing it required a conditional override rather than a new data model, is the difference between a feature spec that engineers trust and one they push back on. Understanding what makes something easy or hard to build is part of the job.


Portfolio case study: competitive analysis, user research synthesis, feature specification, and RICE prioritization. Product: Glints TapLoker (Indonesia). All estimates are based on publicly available product behavior and inferred from standard industry benchmarks.