Wikipedia:Reference desk/Archives/Computing/2016 March 18

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


March 18[edit]

Quadratic Equation[edit]

[Moved to WP:RD/MATH.] Tevildo (talk) 08:41, 18 March 2016 (UTC)[reply]

Google ping report- what does it show about the connection?[edit]

--- google.com ping statistics ---
472 packets transmitted, 382 received, 19% packet loss, time 476455ms
rtt min/avg/max/mdev = 33.745/34.727/169.667/7.180 ms

What does this show about the connection? Is 19% loss normal for a good connection? — Preceding unsigned comment added by 14.139.185.2 (talk) 10:18, 18 March 2016 (UTC)[reply]
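For reference, the 19% figure follows directly from the transmitted/received counts in the summary line. A minimal Python sketch of the arithmetic, using the numbers from the report above:

```python
def packet_loss_percent(transmitted: int, received: int) -> float:
    """Percentage of packets that never came back."""
    return (transmitted - received) / transmitted * 100

# Figures from the report above: 472 sent, 382 received.
loss = packet_loss_percent(472, 382)
print(f"{loss:.1f}% packet loss")  # → 19.1% packet loss
```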

Most definitely not, although it's difficult to know precisely what the problem is from the limited info available. Nil Einne (talk) 14:13, 18 March 2016 (UTC)[reply]
Agree we can't diagnose, but here's some info that might help OP: Most times I get packet loss that high, it is due to interference or weak signal on WiFi. So if this is from a WiFi connection, I recommend trying a wired internet connection. If the problem goes away, you know it was due to WiFi, and if it does not, then you know it was not a WiFi problem. If you can rule WiFi in or out with this method, let us know and we can likely help further. SemanticMantis (talk) 15:42, 18 March 2016 (UTC)[reply]
Another way to test that - for devices that don't have a wired ethernet port - is to ping the wifi router itself. These are often (but not universally) to be found at the IP address: 192.168.1.1. If you still get lots of lost packets - then it's a Wifi problem for sure - if you have a near-perfect connection - then it's something else. It's worth doing the test while you're just a few feet away from the Wifi router - and again at the farthest corners of the space you use it from, behind walls, around corners, with doors closed or open, etc. This will give you a feel for how bad the problem is, and whether a WiFi repeater/range-extender would help. SteveBaker (talk) 20:22, 21 March 2016 (UTC)[reply]
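The comparison described above can be scripted. The sketch below is an illustration, not anyone's actual tool: it assumes a Unix-like `ping` that accepts `-c` and prints a summary line in the format shown in the question, and that the router really is at the common default 192.168.1.1.

```python
import re
import subprocess

def parse_loss(ping_output: str) -> float:
    """Extract the packet-loss percentage from ping's summary line."""
    match = re.search(r"([\d.]+)% packet loss", ping_output)
    if match is None:
        raise ValueError("no summary line found in ping output")
    return float(match.group(1))

def ping_loss(host: str, count: int = 50) -> float:
    """Ping `host` `count` times and return the loss percentage."""
    result = subprocess.run(["ping", "-c", str(count), host],
                            capture_output=True, text=True)
    return parse_loss(result.stdout)

# Compare the WiFi hop against the full route, e.g.:
#   ping_loss("192.168.1.1")  # router only: isolates the WiFi link
#   ping_loss("google.com")   # whole path: ISP and beyond
```

If the router-only number is near zero while the google.com number is high, the problem lies beyond the WiFi link.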

iCloud login[edit]

Is this normal or suspicious: soon after I go online (Firefox), a popup box appears for iCloud login, asking for password; if I cancel, a 2nd identical popup box appears. This seems like possible phishing. Also, I can't find out exactly where this input box originates. Am I being paranoid, or is this something suspicious? — Dynamic IP:2600:1004:B06D:A71E:1920:9331:C647:963C (talk) 20:10, 18 March 2016 (UTC) -- Does this happen to others? — Me again:2600:1004:B06E:E61E:2DF2:3EC3:21EF:B72F (talk) 03:50, 19 March 2016 (UTC)[reply]

Can we assume you are using a Windows 10 PC? 175.45.116.66 (talk) 23:55, 20 March 2016 (UTC)[reply]

No, Mac (MacBook Air). -- Dynamic IP:2600:1004:B016:7C7A:5F9:69FF:E30A:CDED (talk) 15:59, 22 March 2016 (UTC)[reply]

Why do I get a white screen?[edit]

My Internet is slow, and this seems to be a bigger problem with my new computer than with the older one. I have an HP 251-a126 Windows 10 64-bit and use Microsoft Edge. If I am going to a site for the first time it takes a while to appear on the screen, but sometimes even with sites I have been to, the URL will appear quickly at the top of the screen and the name of the site will appear above that. Even after the dots have stopped chasing each other in a circle to the left of the site's name, most of the rest of the screen is white, or I am still where I was before. Eventually, the site I think I am on will appear, but I usually have to refresh.— Vchimpanzee • talk • contributions • 21:16, 18 March 2016 (UTC)[reply]

You might be running into a problem with bloatware (that link seems to go to the wrong place). That is, the latest and greatest O/S and browser are likely to require more resources to give the same performance. So, unless you have the latest, most powerful PC, the latest software is likely to be painfully slow. StuRat (talk) 01:38, 19 March 2016 (UTC)[reply]
It's a brand new PC, but when I first go to each site, it is painfully slow.— Vchimpanzee • talk • contributions • 18:03, 19 March 2016 (UTC)[reply]
That sounds like a security scan is being done. You could disable those to speed it up, but then your PC may be at risk. StuRat (talk) 20:30, 19 March 2016 (UTC)[reply]
It might be worthwhile trying a different browser. I'm not sure how bloated Microsoft Edge is. Dbfirs 16:32, 20 March 2016 (UTC)[reply]
Of course, everything is back to normal once I have visited a site and go to different pages on that site. Trying to change browsers for me is not an option. Downloading anything is next to impossible and getting used to new things isn't easy.— Vchimpanzee • talk • contributions • 21:02, 21 March 2016 (UTC)[reply]

Difference Debian / Ubuntu[edit]

I was told that Debian is more conservative in its approach, only including changes that are stable, unlike Ubuntu, which adds new features faster. Debian "doesn't have all the bells and whistles" that Ubuntu has. What "bells and whistles" are these that you can find in Ubuntu? --Llaanngg (talk) 19:34, 18 March 2016 (UTC)[reply]

The most visible item is, of course, Unity (user interface) - present in Ubuntu and decidedly not present in Debian.
A lot of the other items may be a bit more difficult to discuss unless you are a linux programmer. Take a look at package sources or browse the complete package listing to see what came from Debian, and what else came from elsewhere.
Nimur (talk) 23:44, 18 March 2016 (UTC)[reply]
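On either distribution you can ask apt itself where a package comes from. A sketch in Python wrapping `apt-cache policy` (a standard command on both Debian and Ubuntu); the helper names and the sample output in the test are illustrative only:

```python
import re
import subprocess

def repo_hosts(policy_output: str) -> list[str]:
    """Pull the repository hostnames out of `apt-cache policy` output."""
    return sorted({m.group(1) for m in
                   re.finditer(r"https?://([^/\s]+)", policy_output)})

def package_origins(package: str) -> list[str]:
    """Ask apt which repositories offer `package`."""
    out = subprocess.run(["apt-cache", "policy", package],
                         capture_output=True, text=True).stdout
    return repo_hosts(out)

# e.g. package_origins("firefox") on Ubuntu would typically mention
# archive.ubuntu.com, while on Debian it would be a debian.org mirror.
```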

Video game programming in the 1980s[edit]

I recently purchased a used copy of Shadow of the Beast, fully original. What struck me was the introduction of the programmers themselves, at the end of the manual. They were only 20 and 21 years old at the time (1989, meaning they were born in the late 1960s). They said they had started a computer science class at university, but soon dropped out to become commercial game programmers. Did they really do all this by themselves, learning not only Motorola 68000 assembly programming but also the Amiga's chipset and game development all by themselves? They had help from outside sources with the graphics and the music, but nevertheless, the programming task is formidable enough by itself. At that age, I myself had a good grasp of the basics of programming, but that was limited to a few hobbyist games only. The only assembler language I have ever learnt is that of the Commodore 64, and I'm not even fully proficient at that. On the other hand, I am a fully salaried full-time IT professional in Microsoft .NET C# programming, but that is beside the point. The technologies are vastly too different. What was life like as a professional video game programmer in the 1980s? How did people come to learn assembly programming? Were there any courses or were people self-taught? JIP | Talk 21:53, 18 March 2016 (UTC)[reply]

Just last week I was reading an archived copy of Byte Magazine, c. 1985, that had interviews with several of the top game programmers. I'll try digging up the exact issue and will report back with a link. Nimur (talk) 23:47, 18 March 2016 (UTC)[reply]
I don't think the programming in games like the Amiga version of Shadow of the Beast is very difficult. If you're given the graphics and the audio files - it's really not that hard. They probably had the benefit of existing code for playing sounds and positioning sprites - perhaps taken from earlier projects. Given those resources, I think I could write a game like that in a few months.
I graduated from the University of Kent at Canterbury in 1977 with a degree in Cybernetics - we were taught machine code programming on a PDP-11 and had to do one of our final projects in assembler. I had to teach myself how to do it on 8008, 8080, 8085, Z80, 6502, 6800, 68000, 8088 and 8086...but by about that time, hardly anyone wrote assembler anymore. Once you know the basic principles of assembler, learning another CPU architecture is quite simple.
I wasn't a video game programmer in the 1980's - but I did write a bunch of Z80 games for TRS80 at home, in assembler - and at work, I did research into 3D graphics and worked on the team that made the first ever CD-ROM. But by the 1980's it was becoming increasingly rare to write anything large in assembler - mostly it was C or Pascal...then only C, then only C++. But a video game like Shadow of the Beast was a pretty simple undertaking - and speed was still important - so I could imagine still using assembler for it.
I am currently a video game programmer - and have been for quite a while now. I honestly don't feel that the act of writing games has really changed because of the shift from assembler to high level languages.
The HUGE change is that people expect so much more depth from a modern PC/Console game. When you only needed the player to run left or right, jump/crouch/attack - and the bad guys did nothing much more than to run towards you or hang back - and the graphics were simple 2D sprite animations - life was VERY easy. One good programmer could easily write an entire game in a few months.
These days, games are mostly in 3D - we need music that adjusts to the play, characters with actual AI that plan attacks against you - there is a story, you can customize your character - the sheer volume of 3D art is overwhelming. We have multiplayer games which operate across the Internet and require most of the gameplay software to be in a server someplace. It takes a team of between a hundred and five hundred people several years to produce a triple-A title...and even then, they are probably relying on lots of "middleware" to take care of graphics, physics, AI and such.
But - if you're writing a game like "Angry Birds" for a phone - you can still do it in a month with one programmer and one artist...and a blockbuster like 2048 (video game) can be written by a 19 year old in one weekend. (I could get that one knocked out in a day!)
SteveBaker (talk) 17:47, 19 March 2016 (UTC)[reply]
"But by the 1980's it was becoming increasingly rare to write anything large in assembler..." This might be somewhat accurate when talking about "home computers", but most console games until the fifth generation, well into the '90s, were written in assembler. (This includes the Game Boy, which didn't see a full successor system until 2001.) Of course this is in part because the power and complexity of console hardware has tended to lag behind that of home computers, for cost reasons. --71.110.8.102 (talk) 02:40, 20 March 2016 (UTC)[reply]
Nearly all of them were career changers. Some were physicists, some were mathematicians, and some programmers were former craftsmen who liked to play with virtual mechanical stuff, using it nearly without limit - but memory was a scarce resource. Pseudo-3D was done by manipulating so-called sprites, of which the C64's video unit supported only a small number. Many games were programmed in assembler so as not to waste performance or memory. CPUs ran at just 1 MHz and RAM was only 64 KB. --Hans Haase (有问题吗) 10:45, 20 March 2016 (UTC)[reply]
I know about the features and limitations of 8-bit and 16-bit computers in the 1980s. I was there, remember? I just wasn't anywhere near proficient enough in assembly programming to have made a commercial game of my own. I was just amazed that someone so young, especially with so little formal training, could just have gone ahead and made a commercial game straight away. JIP | Talk 20:43, 20 March 2016 (UTC)[reply]
You might find a book like Masters of Doom interesting. John Carmack was writing commercial games by the time he was 19, and wrote Doom at age 23. These pioneering "kids" had autistic levels of obsession, literally spending days straight writing code, fueled on not much more than pizza and Mountain Dew ;) Vespine (talk) 01:08, 21 March 2016 (UTC)[reply]
There is no doubt that the kinds of people who can (and will) do this are a bit "special". I have Asperger syndrome (which has recently been rolled into "Autism" by the professional diagnosticians) - and I'm left-handed. Over the decades, these things have interested me - and I've maintained an informal count of co-workers with similar conditions. In the general population, about 10% of people are left-handed. My co-workers are about 50% left-handed. Figures for Asperger's are harder to estimate - maybe one in a thousand people in the world have it. I'd estimate that maybe 20% to 30% of my co-workers have it to some degree - and about 5% of them - like me - have bothered to get an actual diagnosis. Left-handedness is correlated with good spatial awareness - so it's no surprise that we lefties dominate the computer graphics business. I see fewer lefties in (for example) AI and Audio programming disciplines than in my own field of 3D graphics. Asperger's is tied to the ability to concentrate on one very small area of interest for long periods without getting bored, and a willingness (indeed eagerness) to avoid direct social communication - and that fits the picture too.
When my wife goes to company events and meets my co-workers, she comes away saying things like "Oh my god! They're all just like you!"...like this is a surprise!  :-)
No matter how you look at those numbers - there is clear evidence that people who are passionate about (and good at) writing video games are not "normal" people. Our brains work differently. Certainly, when I was younger, I'd get so deep into writing something that I'd go for two or three days on not much more than cat-naps, soda and junk food until the 'itch' was scratched. As I've aged, that's morphed into longer periods, with more sleep and better nutrition - but the obsession is still there.
SteveBaker (talk) 14:40, 21 March 2016 (UTC)[reply]
Lore has it that mathematicians skew toward sinister as well, but the claims often stop at the level of anecdote: e.g. "There are a ton of lefties in my geometric topology class". Here are some real science articles on the topic in case anyone is interested [1] [2] [3]. That last one has empirical distributions of nine professions, including mathematician but not software developer. I've posted a table from the article here [4], which shows that architect is the most left-handed of the professions, well above the population average. SemanticMantis (talk) 15:33, 21 March 2016 (UTC)[reply]
I agree that in "computer programmers" in general, the "lefty" effect is much less noticeable - but in the somewhat specialized sub-field of "3D computer graphics programmers" - it's exceedingly noticeable. I was once team lead of a group of seven graphics guys - and five of them were lefties. I was aware that architects were the most strongly left-handed occupation in those studies. They have to solve the same kinds of spatial thinking problems that 3D graphics engineers need - and there is quite a bit of overlap between those occupations. I'm a computer graphics engineer - and in my spare time, I design model buildings, ships and other vehicles for my wife's business...so I guess architecture is a 'thing' for me too! SteveBaker (talk) 20:10, 21 March 2016 (UTC)[reply]