
LAR_Systems

Member
  • Content Count: 612
  • Joined
  • Last visited

Reputation Activity

  1. Informative
    LAR_Systems got a reaction from marknd59 in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Just as a follow-up, some changes have been made to how the charts are implemented when you toggle them on and off.   The F@H "slot id" is dynamically assigned, and the client previously only supported up to the number 8... the issue is that F@H, if you add and remove devices in the same session, can increment the number to be larger than 8, and then you don't get a chart even if there is only one device folding.

    So, a little rework on the code: to account for this it can now dynamically create charts for up to 32 slot IDs, for those that end up with high IDs from trying stuff in the client, or if you have a banging system with sooooo many GPUs 😉 
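    In case it helps picture the change, here is a minimal sketch of lazily creating a chart per slot ID with a 32-ID cap. The class and field names are purely illustrative and not the extension's actual code:

```python
# Hypothetical sketch of the rework described above: chart containers are
# created lazily per F@H slot ID instead of from a fixed list of 0-8,
# capped at 32 IDs (0-31). Names and structure are assumptions.

MAX_SLOT_IDS = 32  # upper bound on dynamically assigned slot IDs

class ChartRegistry:
    """Creates a chart container the first time a slot ID is seen."""

    def __init__(self):
        self.charts = {}

    def chart_for(self, slot_id: int):
        if not 0 <= slot_id < MAX_SLOT_IDS:
            return None  # out of supported range; no chart rendered
        # Create the chart lazily, so a high ID left over from adding
        # and removing devices in one session still gets a chart.
        if slot_id not in self.charts:
            self.charts[slot_id] = {"slot": slot_id, "points": []}
        return self.charts[slot_id]

registry = ChartRegistry()
assert registry.chart_for(13) is not None   # high ID still gets a chart
assert registry.chart_for(40) is None       # beyond the 32-ID cap
```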

    Anyway, try it now to see if the charts show.

    Cheers,
  2. Informative
    LAR_Systems got a reaction from TVwazhere in LTT "Summer" Folding Sprint   
    The people reporting numbers in the 2 Mil range are currently running on target; it looks like the PPD reward / WUs have changed over the past two weeks for the card, which happens when projects that reward heavily are replaced by something else.

     
  3. Informative
    LAR_Systems got a reaction from GOTSpectrum in LTT "Summer" Folding Sprint   
    The people reporting numbers in the 2 Mil range are currently running on target; it looks like the PPD reward / WUs have changed over the past two weeks for the card, which happens when projects that reward heavily are replaced by something else.

     
  4. Like
    LAR_Systems got a reaction from Favebook in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Hi,
    There are multiple 1070s and other GPUs with seemingly duplicate records because there are slight variations on the model as identified by the F@H client (and shown under the card name), so the records are kept separate, as the different revisions, mobile units, etc. behave differently with different firmware/hardware.
     


    In terms of the ranks, they are based on real-time data that is collected from users using the extension and averaged across different work units and projects, which have different PPD rewards that can and do change.
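    As a rough illustration of the averaging described above, here is a minimal sketch; the (project, PPD) sample shape is an assumption, not the database's real schema:

```python
# Illustrative sketch only: a GPU record's published figure is a mean
# over PPD samples from different work units / projects, which reward
# differently. Field shapes are assumptions.
from statistics import mean

def average_ppd(samples):
    """Average reported PPD samples for one GPU record.

    Each sample is a (project_id, ppd) pair; the published figure is a
    mean across all projects seen for that card.
    """
    if not samples:
        return 0.0
    return mean(ppd for _, ppd in samples)

# Hypothetical project IDs and PPD values for illustration:
samples = [(17433, 2_100_000), (18202, 1_700_000), (18471, 1_900_000)]
print(round(average_ppd(samples)))  # → 1900000
```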
     
    As for your card reporting low PPD: the image in your post appears to be broken, so I can't see what card / issue you are showing... but if you are seeing dramatically lower PPD and you are new to folding, you may not have an API key set up, which will give you far less PPD than the values reported in the database, as the database only records PPD values for users with an F@H API key.

    Otherwise I would need to see the issue / know what hardware and work unit you are on to review further.

    Cheers,
  5. Like
    LAR_Systems reacted to GOTSpectrum in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Thank you for all your hard work LAR!
  6. Like
    LAR_Systems got a reaction from GOTSpectrum in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Hi,
    There are multiple 1070s and other GPUs with seemingly duplicate records because there are slight variations on the model as identified by the F@H client (and shown under the card name), so the records are kept separate, as the different revisions, mobile units, etc. behave differently with different firmware/hardware.
     


    In terms of the ranks, they are based on real-time data that is collected from users using the extension and averaged across different work units and projects, which have different PPD rewards that can and do change.
     
    As for your card reporting low PPD: the image in your post appears to be broken, so I can't see what card / issue you are showing... but if you are seeing dramatically lower PPD and you are new to folding, you may not have an API key set up, which will give you far less PPD than the values reported in the database, as the database only records PPD values for users with an F@H API key.

    Otherwise I would need to see the issue / know what hardware and work unit you are on to review further.

    Cheers,
  7. Like
    LAR_Systems got a reaction from RollinLower in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Power outage for the API server today... good times.  Back up now. 
  8. Informative
    LAR_Systems got a reaction from marknd59 in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    Power outage for the API server today... good times.  Back up now. 
  9. Informative
    LAR_Systems got a reaction from marknd59 in F@H Web Client Dark Skin with PPD comparison by GPU / WU   
    I saw this. The issue is that the project throws a 404 on their API; it looks like they were handing out work units but did not add the project to the project list, so I can't get the ID to store the data correctly. Once F@H adds it to the DB for the API, my end will pick it up and associate all the back data. It's also why their project description window does not load.
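    The "park the data until the project appears" behaviour described above could look roughly like this; the function and record shapes are hypothetical, not the actual database code:

```python
# Hypothetical sketch: if the F@H project API 404s for a project ID,
# park the work-unit record in a backlog and re-associate it once the
# project is added. fetch_project and record shapes are illustrative.

def ingest(record, fetch_project, backlog, store):
    """Store a WU record now, or park it until its project resolves."""
    project = fetch_project(record["project_id"])  # None models a 404
    if project is None:
        backlog.append(record)   # keep the back data for later
        return False
    store.append({**record, "project": project})
    return True

def retry_backlog(fetch_project, backlog, store):
    """Re-try parked records, e.g. once F@H adds the project."""
    pending = list(backlog)
    backlog.clear()
    for record in pending:
        ingest(record, fetch_project, backlog, store)

# Usage: the record is parked while the project 404s, then picked up.
backlog, store, known = [], [], {}
fetch = lambda pid: known.get(pid)
ingest({"project_id": 99, "ppd": 1}, fetch, backlog, store)   # parked
known[99] = {"name": "p99"}                                   # F@H adds it
retry_backlog(fetch, backlog, store)                          # associated
assert backlog == [] and len(store) == 1
```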
  10. Like
    LAR_Systems got a reaction from Baha in BOINC Pentathlon 2021   
    I just dropped a 2950TR on SI and a RTX 3090 will run on Prime for a couple days.  Hope this makes a dent.
  11. Like
    LAR_Systems reacted to GOTSpectrum in BOINC Pentathlon 2021   
    Massive thanks
  12. Like
    LAR_Systems got a reaction from Gorgon in BOINC Pentathlon 2021   
    I just dropped a 2950TR on SI and a RTX 3090 will run on Prime for a couple days.  Hope this makes a dent.
  13. Like
    LAR_Systems got a reaction from MrFinn741 in BOINC Pentathlon 2021   
    I just dropped a 2950TR on SI and a RTX 3090 will run on Prime for a couple days.  Hope this makes a dent.
  14. Like
    LAR_Systems got a reaction from GOTSpectrum in BOINC Pentathlon 2021   
    I just dropped a 2950TR on SI and a RTX 3090 will run on Prime for a couple days.  Hope this makes a dent.
  15. Like
    LAR_Systems reacted to J-from-Nucleon in BOINC Pentathlon 2021   
    Ayyy, I hit 25,000 and it's my birthday

  16. Funny
    LAR_Systems got a reaction from MrFinn741 in BOINC Pentathlon 2021   
    The real reason new hires wait weeks to get their email and shared network drive 😉
  17. Funny
    LAR_Systems got a reaction from marknd59 in BOINC Pentathlon 2021   
    The real reason new hires wait weeks to get their email and shared network drive 😉
  18. Like
    LAR_Systems got a reaction from Baha in BOINC Pentathlon 2021   
    Afraid not... that would mean I can't do development on the folding DB / web client; I need my systems running for testing and some of the uptime tests I have set up.
    I have the Python terminal-based client prototype running, but it needs work.
  19. Like
    LAR_Systems got a reaction from Baha in BOINC Pentathlon 2021   
    I have 25 threads running on the Marathon project.   We will see how badly the folding DB suffers as it's that server doing the lifting 😉
  20. Funny
    LAR_Systems got a reaction from GOTSpectrum in BOINC Pentathlon 2021   
    The real reason new hires wait weeks to get their email and shared network drive 😉
  21. Like
    LAR_Systems got a reaction from Gorgon in BOINC Pentathlon 2021   
    I have 25 threads running on the Marathon project.   We will see how badly the folding DB suffers as it's that server doing the lifting 😉
  22. Funny
    LAR_Systems got a reaction from leadeater in BOINC Pentathlon 2021   
    The real reason new hires wait weeks to get their email and shared network drive 😉
  23. Funny
    LAR_Systems reacted to leadeater in BOINC Pentathlon 2021   
    Well 5 of those newer servers literally turned up yesterday and I can delay putting them through to production, for how long I don't know lol
  24. Like
    LAR_Systems got a reaction from leadeater in BOINC Pentathlon 2021   
    Afraid not... that would mean I can't do development on the folding DB / web client; I need my systems running for testing and some of the uptime tests I have set up.
    I have the Python terminal-based client prototype running, but it needs work.
  25. Like
    LAR_Systems got a reaction from GOTSpectrum in BOINC Pentathlon 2021   
    I have 25 threads running on the Marathon project.   We will see how badly the folding DB suffers as it's that server doing the lifting 😉