
Leaderboard


Popular Content

Showing content with the highest reputation since 07/11/2018 in all areas

  1. 1 point
    djk44883

    Who testmy.net is

    Thanks for your reply. I just didn't see any ads to know who supports the test. Still, you claim your "proprietary method is proven to help identify issues other speed tests fail to detect" without telling us what issues it helps me identify. Can I really find info on a proprietary method? It just sounds vague and generalized. All that aside, I do find testmy.net to be one of the more reliable "speed test" sites I use. THANK YOU for all the work you have put into it. DJ
  2. 1 point
    BOR15

    2 graphs

    OK, on closer inspection I think I worked out the top graph. It is an analysis of the download, showing how consistent the actual speed of the download is.
  3. 1 point
    CA3LE

    Give Reporting Include and Exclude Functions

    I'll keep this in mind when I start on the database searching improvements. Development on that will start shortly after the new beta is released. What I was planning to do is have it aggregate the user agent information. It will search through your result details, group the user agents, and allow you to do more detailed searches. You'll also be able to select multiple identifiers, so you can select "Android & iOS", for instance, or select the inverse to see only desktop and laptop results. If you want to be part of the beta group, vote "Yes" on this topic >> It's an old topic from the last beta of the version you're using today... a fresh beta is coming soon.
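The grouping and include/exclude idea above can be sketched in a few lines. This is purely a hypothetical illustration, assuming each result row carries a raw `user_agent` string; the keyword list and platform names are my own, not TMN's actual implementation.

```python
from collections import Counter

# Hypothetical platform keywords -- TMN's real aggregation is not public.
PLATFORM_KEYWORDS = [
    ("Android", "Android"),   # must come before "Linux": Android UAs contain both
    ("iPhone", "iOS"),
    ("iPad", "iOS"),
    ("Windows", "Windows"),
    ("Macintosh", "macOS"),
    ("Linux", "Linux"),
]

def platform_of(ua):
    """Map a raw user-agent string to a coarse platform name."""
    for keyword, platform in PLATFORM_KEYWORDS:
        if keyword in ua:
            return platform
    return "Other"

def group_user_agents(results, include=None, exclude=None):
    """Count results per platform; `include`/`exclude` are sets of
    platform names, mirroring the include/exclude search idea."""
    counts = Counter()
    for row in results:
        p = platform_of(row["user_agent"])
        if include is not None and p not in include:
            continue
        if exclude is not None and p in exclude:
            continue
        counts[p] += 1
    return counts

results = [
    {"user_agent": "Mozilla/5.0 (Linux; Android 8.0; Pixel 2)"},
    {"user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X)"},
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
]

# "Android & iOS" selection -- the mobile-only view described above
print(group_user_agents(results, include={"Android", "iOS"}))
```

Selecting the inverse (desktop/laptop only) would just pass the same set as `exclude` instead of `include`.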
  4. 1 point
    CA3LE

    Fail gracefully when very bad connection

    Update: The new upcoming version of TestMy.net will not fail on any major updated browser. I've implemented a service worker, which enables TestMy.net offline... obviously you won't be able to test in that state, but it is helping me make this correct and more useful for you (and myself). It gives me abilities as a developer that I've never had before. I've run numerous batches of tests, purposely disconnecting the internet, and the new version has never failed. It will log those events too; it doesn't right now, but it will.

    I like your idea for implementation... It will detect when you're online, wait, check again to make sure it wasn't just intermittent... then test again when it feels it's able to. It might still not be able to complete the entire test, but it will try, possibly fail again, and restart the retry process. I can tell the averaging and database listing programs as a whole to ignore exactly "zero" so it doesn't affect host averages or flood public results... I don't know what would happen otherwise, so better to plan for it. Especially since that's a point for clients to purposely negatively affect hosts' numbers. (I always want to limit the input received from users... sorry, users. Damn bots and hackers ruined your rep.) That idea provides a simple way to implement this without having to add databases or change the database structure. I like that. Database structure changes can be an extreme headache, especially since I promised from the beginning that I'd retain all of your old results.

    I'll keep developing. The update I'm on right now started purely as a design venture; the more major, feature-rich updates are still planned... but I've been stumbling on so many answers that it's become far beyond just what people see at this point. There is still so much planned; I'm only including the features that I feel are ready. ... I'd rather be on this new site (all day long). I make TMN first as something I want to use... but I'm hoping to attract more than people of just our own mindset. Anyone still reading this is probably of that mindset. Most will never get as far as you. I want the other 98% to get it too. Hopefully this gets a little closer.

    For people who have been long-time visitors, keep your old browsers on virtual machines. I need your help testing old browsers, but also keep in mind that I'm developing for the future; I'm no longer concerned about lagging browsers or wasting my development time around their inconsistencies. I target technologies native to the most popular browsers across the board. If one browser has a janky implementation, I may still release it as long as it's not detrimental to the test results or experience. IMHO, Safari is the new IE right now. Annoying. Chrome, Firefox, and Chrome on iOS and Android (period) are the best right now. I expect Safari to catch up; sorry, it's not on the developer when everyone else agrees.

    By the way, I straight up killed most old browsers when I went full SSL. Full SSL is 100% necessary for the future of what TMN is doing. Sorry, old browsers. Trust me, I took a hit with traffic. Seeing software changing ads in the browser, or injecting ads into pages I don't have ads on... that showed me early on why we need SSL (https). With SSL that can only happen if the cipher is cracked. To be honest, at first I thought, why would TMN need https? Nobody is buying anything from my site or really sending anything truly secure. Uh, think again. Many people may use the same passwords. Hackers can grab that information as you log in at a public wifi and then try the same email address and password on Google, Facebook, eBay, Amazon, etc... till they get a hit. So stupidly easy for hackers. Not only that, but third-party ads may be doing things you don't want... far outside the scope of common practice (which some people already may not want). And again, a third-party program can also edit your webpages and add its own content.

    Above all, they want to make money; usually ad or code insertion is the intent. With the latest SSL, people can't do the same malicious stuff. Not to say it can't be done again; it can always be done again. Nothing made by man can be protected from man. What is created from one man's mind can always be decoded in another's. We just need to evolve with the changes.

    Way off the subject... AI teaching itself to make new ciphers and then keeping the true keys from us worries me the more I think about it. An AI or AIs will invent their own language that we can't possibly understand in our lifetimes. It will be so efficient that humans can't understand it, because we don't work well enough, lol. Take the highest level of cryptography that you can think of and exponentially increase that. Then realize that the program that creates it may be so obscure that it could be hiding information from us, making us feel secure when it really has all the power. I say let us humans keep making mistakes, figuring out each other's mistakes, and improving naturally. We as humans should always, in great detail, fundamentally understand our software and hardware changes... it should never be obscured to the point that no human can understand it. I worry that we'll let computers do our programming and designing to the point that we'll have little understanding of what's truly under the hood or how to control it. "Let the computer figure out how to do that better, it will make it perfect" -- it's cool until it starts talking in a language you can't decode and decides you're not a part of understanding that language, simply because you as a human aren't optimal to the system. To a computer, even our VERY best languages are stupid because they aren't optimal, so that's the first thing to go, in my opinion. This has already happened, by the way, just not to a serious degree. A real AI will be ahead of our moves before we even start moving. Once you know it's happening, it's too late.

    I must be high on the AI's list now. (haha) I believe our community here is made up of people who set the standard for their own communities in regard to all things tech. We should set the standard by example. Have your main machines always updated and encourage friends to do the same. Hardware and software. Especially software. And especially right now, because there have been so many major updates that EVERY browser is agreeing on. They don't normally agree like this. Over 21 years developing in the browser and I've never seen such wide adoption of so many cool new features. What a great time to develop.

    Anyone who may still actually be reading this and wants access to the beta, just PM me. If you were a member of any discussion on TMN prior to this post, you can have access to the early beta too. A handful of our veteran members have agreed to help me even earlier to make sure we give you a clean release, but there are always more bugs that we need help finding. -- we'll find 'em together ...and make some more in the process! Humans rule. -D
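The retry idea described in the post (detect you're online, wait, re-check to rule out an intermittent blip, retry the test, and record unrecoverable failures as exactly zero so averages can ignore them) can be sketched roughly as below. This is a minimal illustration of the logic, not TMN's actual code; `run_test` and `is_online` stand in for whatever the real client uses.

```python
import time

def run_with_retry(run_test, is_online, recheck_delay=5, max_attempts=3):
    """Retry a speed test only after connectivity looks stable.

    `run_test` returns a speed (e.g. Mbps) or raises ConnectionError;
    `is_online` is any zero-argument connectivity probe.
    """
    for attempt in range(max_attempts):
        # Wait until we appear to be online...
        while not is_online():
            time.sleep(recheck_delay)
        # ...then check again after a delay, to make sure the
        # connection wasn't just an intermittent blip.
        time.sleep(recheck_delay)
        if not is_online():
            continue  # dropped again; restart the wait
        try:
            return run_test()
        except ConnectionError:
            continue  # failed mid-test; restart the retry process
    return 0.0  # logged as exactly zero -- ignored by the averages

def host_average(results):
    """Average that skips results of exactly zero (failed tests),
    so failures don't drag down host numbers."""
    kept = [r for r in results if r != 0.0]
    return sum(kept) / len(kept) if kept else 0.0
```

Treating "exactly zero" as a sentinel, as the post suggests, avoids any database schema change: failed runs are stored like normal results and simply filtered out at aggregation time.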
  5. 1 point
    Sean

    Host Graph

    To me, that seems like either high packet loss or intermittent drop-outs, both of which you can check by running an extended ping test while streaming. On a Windows PC, open a command prompt (Start -> Windows System -> Command Prompt) and type the following command: ping -t 8.8.8.8

    You should see a continuous run of "Reply from 8.8.8.8" lines. Leave that window open and start streaming a programme. As soon as the streaming stalls, check the window for any "Request timed out" lines. If you see three or more in a row, your connection had a brief outage. Press CTRL + C on the keyboard to stop the ping utility and look at the '% loss' figure. If that is 1% or higher, scroll up (by holding the mouse down on the top-right up-arrow) and look out for lines that say 'Request timed out'. If you see three or more grouped together, that is another brief outage. If you come across five or more 'Request timed out' lines in one screen-full, this is a high packet loss issue, which can also interrupt streaming.

    I've had an issue in the past with at least 3 D-Link routers failing with high packet loss and this exact symptom, i.e. no issue with browsing or speed tests, but they could not reliably stream YouTube.
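Counting the timeouts by eye gets tedious on a long capture. As a sketch, here is one way to summarize a saved copy of that Windows ping output (paste the window contents into a text file first); the loss percentage and "longest run of timeouts" correspond to the '% loss' figure and the brief-outage check described above. The line-matching strings assume English-language Windows ping output.

```python
def summarize_ping(output):
    """Summarize Windows `ping -t 8.8.8.8` output: replies, timeouts,
    loss percentage, and the longest run of consecutive timeouts
    (three or more in a row = a brief outage)."""
    replies = timeouts = run = longest = 0
    for line in output.splitlines():
        if line.startswith("Reply from"):
            replies += 1
            run = 0                       # a reply ends any timeout run
        elif "Request timed out" in line:
            timeouts += 1
            run += 1
            longest = max(longest, run)
        # summary/header lines are ignored
    sent = replies + timeouts
    loss_pct = 100.0 * timeouts / sent if sent else 0.0
    return {"sent": sent, "lost": timeouts,
            "loss_pct": loss_pct, "longest_outage": longest}

# Example: 6 replies, a 3-ping outage, then 1 more reply
sample = "\n".join(
    ["Reply from 8.8.8.8: bytes=32 time=14ms TTL=117"] * 6
    + ["Request timed out."] * 3
    + ["Reply from 8.8.8.8: bytes=32 time=15ms TTL=117"]
)
print(summarize_ping(sample))
```

A `longest_outage` of 3 or more matches Sean's "three or more grouped together" rule of thumb, and a `loss_pct` at or above 1% is the same threshold he suggests checking after stopping the ping.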
Speed Test Version 15.9
© 2018 TestMy Net LLC - TestMy.net - Terms & Privacy