Replit Ask user statistics

Display users’ stats, posts, and other information in the HTML response so that I can easily split and format them for my own use. So far, the only things I’ve been able to obtain are a user’s Replit, bio, and avatar without using ChromeDriver or any other web-automation tool.


Hi @functionally, can I ask what you would like to use these for? It might help others in the community decide whether they want to vote for this #feature-requests topic.


Can’t wait to see what you make. As for the info, you can use an endpoint for user info without needing an API key or staff access: Get a public list of users.

For posts, you can use this endpoint for all the replies on a specific topic. Unfortunately, all the others require a key. But you can always try to scrape web pages for data.
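
The two endpoints mentioned above can be sketched like this. This assumes Replit Ask (ask.replit.com) is a standard Discourse instance, so the routes below are stock Discourse ones (`/directory_items.json` and `/t/{id}.json`), not Replit-specific documentation:

```python
# Sketch of the two public Discourse endpoints mentioned above. Assumes
# ask.replit.com is a standard Discourse instance; no API key is needed.
import requests

BASE = "https://ask.replit.com"

def public_users(period: str = "all") -> list:
    """Fetch the public user directory."""
    r = requests.get(f"{BASE}/directory_items.json", params={"period": period})
    r.raise_for_status()
    return r.json()["directory_items"]

def topic_posts(topic_id: int) -> list:
    """Fetch every post (replies included) on one topic."""
    r = requests.get(f"{BASE}/t/{topic_id}.json")
    r.raise_for_status()
    return r.json()["post_stream"]["posts"]

def usernames(directory_items: list) -> list:
    """Pure helper: pull just the usernames out of a directory page."""
    return [item["user"]["username"] for item in directory_items]
```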

Also, feature requests and bug reports go in #general 99.9% of the time. There are a few exceptions I can think of, but they’re rare.


Why? What’s the point of their own categories then?


@functionally, you can get some basic statistics on the about page.
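
On a stock Discourse forum (an assumption about ask.replit.com), that about page is also served as JSON, so the site-wide numbers can be read without scraping:

```python
# Assumes Replit Ask is a standard Discourse instance, so /about.json exists.
import requests

def site_stats() -> dict:
    """Site-wide counts (topics, posts, users) from the about page's JSON."""
    r = requests.get("https://ask.replit.com/about.json")
    r.raise_for_status()
    return pick_stats(r.json())

def pick_stats(about_payload: dict) -> dict:
    """Pure helper: extract just the stats block from an /about.json payload."""
    return about_payload["about"]["stats"]
```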


Well, other stuff also goes in #general, like polls or general Replit or programming discussion.


I wanted to build a Discord bot that could look up a Replit Ask username and retrieve all of that user’s public information. But I can’t, because I’d need a web-automation tool, so I’m requesting that the developers include users’ stats, posts, and so on in the HTML response so that I can easily split and format it for the bot.

My code:

# from replit import db
import requests

# NOTE: the full profile URL was stripped when this post was formatted;
# it contained "[user]" as a placeholder for the username.
baseUrl = "[user]"

def scrapeUser(user: str):
    if user is not None:
        info = requests.get(baseUrl.replace("[user]", user))
        if info.status_code == 200:
            # Pull the bio and avatar out of the page's Open Graph meta tags.
            bio = info.text.split('<meta property="og:description" content="')[1].split('"')[0]
            avatar = info.text.split('<meta property="og:image" content="')[1].split('"')[0]
            jsonData = {
                "basic": {
                    "username": user,
                    "avatar": avatar,
                    "bio": bio,
                },
                "stats": {"posts": ""},
            }
            return jsonData
        return "failed to retrieve information."

def main():
    username = "functionally"
    inform = scrapeUser(username)
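
A possible alternative to splitting the HTML at all: if Replit Ask is a standard Discourse instance (an assumption, not something Replit documents), each user page is also served as JSON at `/u/<username>.json`, which already carries the bio, avatar, and some stats:

```python
# Assumes ask.replit.com is stock Discourse, so /u/<name>.json exists.
# The field names (bio_raw, avatar_template, post_count) are Discourse's,
# and some may be absent depending on privacy settings, hence the .get() calls.
import requests

def fetch_user(user: str):
    """Fetch a user's public profile as JSON; None on a non-200 response."""
    resp = requests.get(f"https://ask.replit.com/u/{user}.json")
    if resp.status_code != 200:
        return None
    return shape_user(resp.json())

def shape_user(payload: dict) -> dict:
    """Pure helper: reshape Discourse's user JSON into the bot's structure."""
    u = payload["user"]
    return {
        "basic": {
            "username": u["username"],
            "avatar": u.get("avatar_template", ""),
            "bio": u.get("bio_raw", ""),
        },
        "stats": {"posts": u.get("post_count", 0)},
    }
```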


Unfortunately, you would need to scrape the web pages for Ask. For the main site, though, you can use GraphQL.


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.