
@yetanotherchris
Last active October 27, 2024 20:40
Revisions

  1. yetanotherchris revised this gist Oct 27, 2024. 1 changed file with 5 additions and 0 deletions.
    5 changes: 5 additions & 0 deletions valve-trust-score.md
    @@ -77,6 +77,11 @@ _A few observations of my own:_
    1. The trust system seems to be easily gamed by high-value skin purchases.
    1. Since 2020 the player age appears to have dropped a lot for CS2, which is a PEGI 16 game.
    Many pre-high-school-age players need a different psychological model for trust than the one Valve uses, one that is more punitive.
    1. Most of the CS2 aimbotting seems to have gone, in the mid-ranks.
    1. The estimate that about 40% of the player base cheats (wallhacking) is closer to 50% in competitive mode.
    1. The profile makes a difference to who you are matched against/with, although the RNG is still more prominent across the player base. My tip: put "faceit level 8" and "nice to meet you" in the profile, with a custom friendly-looking avatar, to enter the weird world of boosting lobbies. Use some kind of neutral built-in avatar and name to sit with the normal pool.
    1. The algorithm matches you with people who use your play-style, e.g. shotgun only. One tip for this: use a lot of flashes in a game and you'll be matched with people who know how to use grenades ("util").
    1. Maybe in the future Microsoft's Windows Core Isolation (memory integrity) feature will render the cheats unusable, if Valve make it a requirement.

    I have a feeling that Valve use clustering and/or Apriori/FP-Growth style matching for players, and also have a limited set of clusters to choose from - similar to the YouTube algorithm, where you were matched to existing "actors". For example you can get put
    into an "AFK" player cluster or a "griefing" player cluster, and often they're combined. This seems to happen when there are
  2. yetanotherchris revised this gist Oct 15, 2024. 1 changed file with 11 additions and 14 deletions.
    25 changes: 11 additions & 14 deletions valve-trust-score.md
    @@ -34,16 +34,14 @@ This is slightly contradicted by the following,
    > existing matchmaking technology, which uses static rules to determine
    > the trust levels of users
    The system seems to be a reinforcement learning from human feedback (RLHF) system, at least in CS2.
    It also seems to be influenced by in-game report gaming and consequently the
    [Just World Fallacy](https://en.wikipedia.org/wiki/Just-world_fallacy). The reality, in the case of Counter-Strike (which
    is alluded to in the patent):

    > One popular video game genre where players often play in multiplayer mode is the first-person shooter genre
    is that Valve have built a trust system but for a prison-inmate type population (anti-social traits), and are using negative reinforcement rather than Blizzard's positive-reinforcement approach in Overwatch. The more cynical commentators would suggest Valve will always be more interested in the flow of game case sales, which promotes Steam market game sales, than influencing their
    player base for games, much as a casino isn't too interested in enforcing anti-gambling morals on its regulars.

    From the US patent, Valve are basing the trust score on:
    @@ -70,22 +68,21 @@ From the US patent, Valve are be basing the trust score on:

    _A few observations of my own:_

    1. You get 25+ kills in your first CS2 game of the day, so there is a skill-rating mismatch. Your trust score decreases from this (and/or people report you), so your next match is against low-trust players.
    - You score low in this next game, confusing the model further into thinking your next game should be against low-trust players, because there is such a big score discrepancy.
    1. You don't have a microphone enabled, for various reasons, and Valve seem to think you should have a mic after a certain rating, so trust lowers.
    1. You play against bad actors who game the trust system by only playing 1 match a day or week, but on multiple accounts.
    1. You get punished by the model for playing multiple (4+) games, as this is booster-lobby (cheating) behaviour.
    1. A high trust pool is non-existent during the day.
    1. The trust system seems to be easily gamed by high-value skin purchases.
    1. Since 2020 the player age appears to have dropped a lot for CS2, which is a PEGI 16 game.
    Many pre-high-school-age players need a different psychological model for trust than the one Valve uses, one that is more punitive.

    I have a feeling that Valve use clustering and/or Apriori/FP-Growth style matching for players, and also have a limited set of clusters to choose from - similar to the YouTube algorithm, where you were matched to existing "actors". For example you can get put
    into an "AFK" player cluster, "griefing" player cluster, and often they're combined. This seems to happen when there are
    fewer people online; the parameters are more refined at evening peak time. At peak time, I've
    found myself in a team with people who appear to have very similar profiles to me, except 1 or 2 of the accounts look bought, in that
    they haven't purchased any new games for a long time and their matchmaking history has been empty for years, yet on the surface we appear to be a very similar cohort.

    ### Links
    - [Valve using sensor data for Trust score](https://www.reddit.com/r/linux_gaming/comments/v1sygi/notice_valve_is_using_vac_valve_anti_cheat_to_spy/)
  3. yetanotherchris created this gist Oct 15, 2024.
    94 changes: 94 additions & 0 deletions valve-trust-score.md
    @@ -0,0 +1,94 @@
    ## Notes on Valve's Trust Score System

    - [The Trust system patent from 2019](https://patents.google.com/patent/WO2020051517A1)
    - [US Trust system patent from 2018](https://patents.google.com/patent/US20200078688A1)
    - [VACnet from 2017](https://patents.google.com/patent/WO2019182868A1)

    See the "Cited By" section at the bottom for other trust score systems - most notably Microsoft's 2005 "Determination of a reputation of an on-line game player".

    ```
    FIG. 2 illustrates examples of other behaviors, besides cheating, which can
    be used as a basis for player matchmaking. For example, the trained machine
    learning model(s) 216 may be configured to output a trust score 118 that
    relates to the probability of a player behaving, or not behaving, in accordance
    with a game-abandonment behavior (e.g., by abandoning (or exiting) the video
    game in the middle of a match). Abandoning a game is a behavior that tends to
    ruin the gameplay experience for non-abandoning players, much like cheating.
    As another example, the trained machine learning model(s) 216 may be configured
    to output a trust score 118 that relates to the probability of a player
    behaving, or not behaving, in accordance with a griefing behavior.
    A “griefer” is a player in a multiplayer video game who deliberately irritates
    and harasses other players within the video game 110, which can ruin the
    gameplay experience for non-griefing players.
    As another example, the trained machine learning model(s) 216 may be
    configured to output a trust score 118 that relates to the probability
    of a player behaving, or not behaving, in accordance with a vulgar language behavior
    ```
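The per-behaviour outputs FIG. 2 describes could be sketched roughly like this - a minimal illustration only, not Valve's actual model; the feature names and weights are invented:

```python
import math

def logistic(x: float) -> float:
    """Squash a raw score into a 0-1 probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical per-behaviour "heads": each maps the same player
# features to a probability of that behaviour, as the patent describes.
BEHAVIOUR_WEIGHTS = {
    "game_abandonment": {"matches_abandoned": 2.0, "hours_played": -0.01},
    "griefing":         {"reports_received": 1.5, "hours_played": -0.005},
    "vulgar_language":  {"chat_mutes": 1.8, "hours_played": -0.002},
}

def trust_scores(features: dict[str, float]) -> dict[str, float]:
    """Return one trust-related probability per behaviour."""
    scores = {}
    for behaviour, weights in BEHAVIOUR_WEIGHTS.items():
        raw = sum(w * features.get(name, 0.0) for name, w in weights.items())
        scores[behaviour] = logistic(raw)
    return scores

player = {"matches_abandoned": 3, "hours_played": 500,
          "reports_received": 2, "chat_mutes": 0}
print(trust_scores(player))
```

In the patent the heads are trained machine-learning models rather than fixed weights, but the shape of the output - one probability per behaviour from shared features - is the same.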

    This is slightly contradicted by the following,

    > [0015] The techniques and systems described herein also improve upon
    > existing matchmaking technology, which uses static rules to determine
    > the trust levels of users
    The system seems to me to be a reinforcement learning from human feedback (RLHF) system, at least in CS2.
    It also seems to be influenced by in-game report gaming and consequently the
    [Just World Fallacy](https://en.wikipedia.org/wiki/Just-world_fallacy). The reality, in the case of Counter-Strike (which
    is alluded to in the patent):

    > One popular video game genre where players often play in multiplayer mode is the first-person shooter genre
    Valve have built a trust system for a prison-inmate type population, using negative reinforcement rather
    than Blizzard's positive-reinforcement approach in Overwatch. The more cynical commentators would suggest Valve will
    always be more interested in the flow of game case sales, which promotes Steam market game sales, than cleaning up their
    player base for games, much as a casino isn't too interested in enforcing anti-gambling morals on its regulars.
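A negative-reinforcement loop driven by reports could look like the following sketch - the update rule and constants are my own assumptions, not anything from the patent:

```python
def update_trust(trust: float, reports_this_match: int,
                 penalty: float = 0.1, recovery: float = 0.02) -> float:
    """Negative reinforcement: reports drive trust down quickly,
    while a clean match only recovers it slowly."""
    if reports_this_match > 0:
        trust -= penalty * reports_this_match
    else:
        trust += recovery
    return min(1.0, max(0.0, trust))  # clamp to [0, 1]

# One match with two reports costs as much trust as five clean matches recover.
trust = 0.8
trust = update_trust(trust, reports_this_match=2)  # 0.8 - 0.2 = 0.6
for _ in range(5):
    trust = update_trust(trust, reports_this_match=0)
print(round(trust, 2))  # 0.7
```

The asymmetry between `penalty` and `recovery` is what makes such a system feel punitive, and why report gaming (mass-reporting a legitimate player) would be so effective against it.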

    From the US patent, Valve are basing the trust score on:

    1. an amount of time a player spent playing video games in general,
    1. an amount of time a player spent playing a particular video game
    1. times of the day the player was logged in and playing video games
    1. match history data for a player, e.g. total score (per match, per round, etc.), headshot percentage, kill count, death count, assist count, player rank, etc.
    1. a number and/or frequency of reports of a player cheating
    1. a number and/or frequency of cheating acquittals for a player
    1. a number and/or frequency of cheating convictions for a player
    1. confidence values (scores) output by a machine learning model that detected a player cheating during a video game
    1. a number of user accounts associated with a single player (which may be deduced from a common address, phone number, payment instrument, etc. tied to multiple user accounts)
    1. how long a user account has been registered with the video game service
    1. a number of previously-banned user accounts tied to a player
    1. a number and/or frequency of a player’s monetary transactions on the video game platform
    1. a dollar amount per transaction
    1. a number of digital items of monetary value associated with a player’s user account
    1. a number of times a user account has changed hands (e.g., been transferred between different owners/players)
    1. a frequency at which a user account is transferred between players
    1. geographic locations from which a player has logged in to the video game service
    1. a number of different payment instruments, phone numbers, mailing addresses, etc. that have been associated with a user account and/or how often these items have been changed
    1. and/or any other suitable features that may be relevant in computing a trust score that is indicative of a player’s propensity to engage in a particular behavior
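The listed signals amount to a per-account feature vector feeding a score. A toy version, where the signal subset, normalisation ranges, and weights are all invented for illustration:

```python
import math

# A hypothetical subset of the patent's signals, normalised to rough 0-1 ranges.
def feature_vector(account: dict) -> list[float]:
    return [
        min(account["hours_all_games"] / 5000.0, 1.0),      # time playing games in general
        min(account["account_age_days"] / 3650.0, 1.0),     # how long the account is registered
        min(account["cheat_reports"] / 50.0, 1.0),          # number of cheating reports
        min(account["linked_banned_accounts"] / 5.0, 1.0),  # previously banned linked accounts
        min(account["inventory_value_usd"] / 1000.0, 1.0),  # digital items of monetary value
    ]

# Invented weights: longevity signals raise trust, abuse signals lower it.
WEIGHTS = [0.3, 0.3, -0.8, -0.9, 0.2]

def trust_score(account: dict) -> float:
    raw = sum(w * f for w, f in zip(WEIGHTS, feature_vector(account)))
    return 1.0 / (1.0 + math.exp(-raw))  # squash to (0, 1)

veteran = {"hours_all_games": 4000, "account_age_days": 3000,
           "cheat_reports": 0, "linked_banned_accounts": 0,
           "inventory_value_usd": 200}
print(trust_score(veteran))
```

A linear scorer like this is exactly the kind of "static rules" system the patent claims to improve upon; the real system would learn such weights from labelled outcomes instead.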

    _A few observations of my own:_

    1. You get a high score in your first game of the day. Your trust score decreases from this (and/or people report you), so your next match is against low-trust players.
    - You score low in this game, confusing the model further into thinking your next game should be against low-trust players, because there is such a big discrepancy.
    1. You don't have a microphone enabled, for various reasons, and Valve seem to think you should have a mic after a certain rating, so trust lowers.
    1. You play against people who game the trust system by only playing 1 match a day or week, but own multiple accounts.
    1. You are punished by the model for playing multiple games, as this is booster-lobby (cheating) behaviour.
    1. A high trust pool is non-existent during the day.
    1. The trust system seems to be easily gamed by high-value skin purchases.
    1. Since 2020 the player age appears to have dropped a lot, when CS2 (for example) is a PEGI 16 game.
    Many pre-high-school-age players need a different psychological model for trust than the one Valve uses, one that is more punitive.

    I have a feeling that Valve use clustering or Apriori/FP-Growth style matching for players, and also have a limited set of clusters to
    choose from - similar to the YouTube algorithm, where you were matched to existing "actors". For example you can get put
    into an "AFK" player cluster or a "griefing" player cluster, and often they're combined. This seems to happen when there are
    fewer people online; the parameters are more refined at evening peak time. At this time, I've
    found myself in a team with people who appear to have similar profiles to me, except 1 or 2 of the accounts look bought, in that
    they haven't purchased any new games for a long time and their matchmaking history has been empty for years.
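The cluster idea above could be sketched as nearest-centroid assignment over behaviour features - the clusters, features, and centroid values here are invented, and Apriori/FP-Growth would instead mine frequent behaviour itemsets, but the assignment step would look similar:

```python
import math

# Hypothetical behaviour clusters as centroids of (afk_rate, report_rate, util_usage).
CENTROIDS = {
    "afk":      (0.80, 0.30, 0.10),
    "griefing": (0.40, 0.90, 0.20),
    "util":     (0.05, 0.10, 0.90),
    "normal":   (0.05, 0.10, 0.40),
}

def assign_cluster(player: tuple) -> str:
    """Put a player in the nearest behaviour cluster (Euclidean distance)."""
    return min(CENTROIDS, key=lambda name: math.dist(player, CENTROIDS[name]))

# A player who idles most rounds but is rarely reported lands in the "afk" cluster.
print(assign_cluster((0.75, 0.35, 0.05)))  # afk
```

A limited, fixed set of centroids like this would explain the observation that off-peak players get lumped into coarse combined pools, while peak-time traffic allows finer matching.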

    ### Links
    - [Valve using sensor data for Trust score](https://www.reddit.com/r/linux_gaming/comments/v1sygi/notice_valve_is_using_vac_valve_anti_cheat_to_spy/)
    - [VACnet details](https://www.reddit.com/r/VACsucks/comments/v257ue/since_some_people_are_confused_about_how_vacnet/)
    - [Valve's revolutionary idea for enforcing fair play in online video games - and invading your privacy to profile you across systems](https://www.reddit.com/r/GlobalOffensive/comments/unoxd1/valves_revolutionary_idea_for_enforcing_fair_play/)