Search the Community
Showing results for tags 'research'.
-
Original article: http://www.ox.ac.uk/news/2016-05-05-oxford-study-finds-virtual-reality-can-help-treat-severe-paranoia I am more than certain it is not the first time VR has been used in research, but it is definitely an interesting development. Also, VR didn't help Jesse Cox with his fear of sharks in the Morpheus demo he saw a couple of months ago. Any ideas on what could be researched next using VR? I have a couple already!
- 8 replies
-
- virtual reality
- oxford
-
(and 3 more)
Tagged with:
-
EurekAlert reports that researchers at the Columbia University School of Engineering and Applied Science have discovered a way to have a WiFi chip transmit and receive on the same frequency, on a single antenna, simultaneously, massively increasing throughput capacity. I think this is great. It can only mean further improvements in wireless technology, and the researchers at the university have something to be proud of. EE Times says that funding is being provided to encourage carriers to adopt the technology, so that it can be implemented more quickly and, hopefully, relatively cheaply. Interesting stuff. 1. http://www.eurekalert.org/pub_releases/2016-04/cuso-wcd041316.php 2. http://www.eetimes.com/document.asp?doc_id=1329466
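The core trick, self-interference cancellation, is easy to sketch in a toy model: because the chip knows exactly what it transmitted, it can estimate its own echo and subtract it from what the antenna hears. The numpy sketch below is purely illustrative (the leakage model and all numbers are invented, and this is nothing like Columbia's actual analog circuitry):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# What we transmit, and the remote signal we actually want to receive.
x_self = rng.standard_normal(n)
x_remote = rng.standard_normal(n)

# The antenna hears the remote signal plus a strong echo of our own
# transmission (a made-up 10x scalar leakage, plus a little noise).
received = x_remote + 10.0 * x_self + 0.01 * rng.standard_normal(n)

# Since we know x_self exactly, estimate the leakage coefficient by
# least squares and cancel our own echo.
leak = np.dot(x_self, received) / np.dot(x_self, x_self)
cleaned = received - leak * x_self

err_before = np.mean((received - x_remote) ** 2)  # dominated by our echo
err_after = np.mean((cleaned - x_remote) ** 2)    # echo mostly removed
```

In practice the cancellation has to happen in analog hardware before the receiver saturates, which is the hard part the researchers actually solved.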
- 1 reply
-
- wifi
- connectivity
-
(and 4 more)
Tagged with:
-
I'm in a bit of an odd situation. There is a job I want on campus that requires that I know how to program certain tasks using the ancient z-Tree and z-Leaf software, and I don't really know how to do that. Can anyone point me to a site or other reference that I can use to learn the skills I need?
- 4 replies
-
- programming
- research
-
(and 2 more)
Tagged with:
-
Australian researchers at the University of New South Wales have announced a new breakthrough that could bring viable quantum computing closer to reality: they have developed a quantum programming language that can take advantage of quantum effects in quantum chips. Professor Andrea Morello, project leader, told The Sydney Morning Herald, "The advancement proves that we can write the most unique type of computer code that a quantum computer can support. In other words, we have now shown that we can access the full 'quantum vocabulary'." He further stated, "Things become more interesting when you have two quantum bits, like in the case of our experiment: there you can write '00', '01', '10', '11' [like in an ordinary computer], but also combinations of them, like '00 + 11', '01 + 10', and so on." It's not all good news, however. Professor Andrew Greentree of RMIT University told Fairfax Media that this is by no means a silver bullet. "They've looked at entanglement between two spins on the same atom: one the spin of the phosphorus nucleus, and the other the spin of a lone electron attached to the atom," he said. "This is not scalable, because the phosphorus really doesn't have any more spins. What they need is to entangle one phosphorus atom with the next." Regardless, this is still viewed as an important step towards bringing quantum computing closer to reality. Source (Sydney Morning Herald): http://www.smh.com.au/technology/sci-tech/unsw-researchers-make-another-quantum-computing-breakthrough-20151116-gkzv4b.html
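Morello's "quantum vocabulary" remark can be made concrete with a few lines of numpy. This is just a textbook illustration of two-qubit state vectors, not anything specific to the UNSW chip:

```python
import numpy as np

# Single-qubit basis states.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Two-qubit basis states are Kronecker products: '00', '01', '10', '11'.
s00 = np.kron(zero, zero)
s11 = np.kron(one, one)

# A quantum computer can also hold combinations such as '00 + 11'
# (normalised so probabilities sum to 1) -- an entangled Bell state,
# which has no classical counterpart.
bell = (s00 + s11) / np.sqrt(2)

# Measuring gives '00' or '11' with 50% probability each, never '01' or '10'.
probs = bell ** 2
```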
-
I know this is not the site to ask a question like this, but this is the off-topic thread, so thank you. I have a science project on some kind of chemical (free to choose) that might impact the human body. For example: the effects of cola on teeth, or the effects of drinking too many energy drinks, which might cause some type of disease. I need help choosing a topic that is relevant and that I can thoroughly explain and provide evidence on. Thanks for any help!
-
"5D Data Storage by Ultrafast Laser Nanostructuring in Glass" is paper published by Zhang,Gecevicius, Beresna and Kazansky from the Optoelectronics Research Centre from the University of Southampton. They claim to have achieved a data density of 360TB per glass disc and a thermal stability up to 1000 deg. C. The data is stored using a 3D data-layout (think minecraft for bits) and using polarization as the 4th and the strength of the burned point as the 5th dimension. But : The Paper is full of buzzwords and vague in a few parts but it seems like a new research direction has been kicked off for the big storage companies. Link to the Paper: http://www.orc.soton.ac.uk/fileadmin/downloads/5D_Data_Storage_by_Ultrafast_Laser_Nanostructuring_in_Glass.pdf
-
Right, for my school Citizenship GCSE coursework I need to research and present a charity, and I have chosen the Cystic Fibrosis Trust. I need you guys to answer a survey for me so I can gather a wide range of results from all age groups and genders! Please help if you have some spare time =D https://www.surveymonkey.com/r/J87WFRJ Thanks a lot!
- 6 replies
-
- graphics card
- help
-
(and 8 more)
Tagged with:
-
Hey guys, can anyone explain to me why we have so little cache per processor? Would being able to double or triple the amount of cache for a processor greatly increase performance, or would it go largely unnoticed? I do research in semiconductor physics, and I want to learn more about processor cache materials to see if I can start up some nanostructure semiconductor research to improve cache costs/size. Anything you know about cache would be really helpful! The wiki page on it is kinda vague.
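Not a full answer, but the standard way to reason about the "double the cache" question is the average memory access time (AMAT) model. The numbers below are invented for illustration, not measurements of any real CPU:

```python
# AMAT = hit_time + miss_rate * miss_penalty
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    return hit_time_ns + miss_rate * miss_penalty_ns

# Assumed numbers: a 4 ns cache in front of 100 ns main memory.
baseline = amat(4.0, 0.10, 100.0)  # 10% miss rate

# Doubling cache size doesn't halve the miss rate; suppose it only
# drops it from 10% to 7%.
doubled = amat(4.0, 0.07, 100.0)
```

So doubling the cache buys you something (here roughly 14 ns down to 11 ns per access), but with diminishing returns, and a bigger cache is also slower to access and eats a lot of die area, which is part of why caches stay small.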
- 3 replies
-
- cache
- material science
-
(and 8 more)
Tagged with:
-
http://45drives.blogspot.ca/2015/06/power-draw-of-enterprise-class-hard.html In part three of our series on power, a member of our R&D team looks into the difference between enterprise and consumer drives with regard to their power draw.
- 4 replies
-
- storinator
- enterprise
-
(and 5 more)
Tagged with:
-
I need some software developers to help me out with my survey regarding Requirements Engineering and requirements elicitation. The link to the survey - https://docs.google.com/forms/d/1QLS2IdrnAcSjX-z2Ev3wJgFE1XemklFMkGoUMUCx5a0/viewform?usp=send_form This will take about 20 minutes. I know it's long, but I really, really need the help!
-
So I'll start this off by saying: I'm already a bit of a veteran when it comes to building PC rigs. I've made several of my own, I've made other people's, I've helped other people make theirs, blah-blah-blah. I know all the little things, I understand the hardware, so you don't need to explain the very basics. The reason I'm making a post on this forum, which is out of character for me anyway, is because I've built gaming rigs. This is completely new ground for me, different purpose, different type of hardware, different requirements. So, I got into computer science and I'm starting projects that need a little bit more than my desktop. Or at least a little bit different. Right now, the very first thing on my mind and only one of the things that I know I'll use it for, is calculating optimal solutions to permutation groups with brute force. I'm not an idiot, I know that the best personal supercomputer an unlimited budget can buy is still going to suck at that, that's the point. I can't use my desktop because, one, it's my desktop, and two, it's a gaming rig, which is a completely different type of powerful. So I'm going to use this rig for massive computing at heavy loads over long periods of time, I'm going to be using it for simulations, and any experiment or test I need. I know all the hardware that's out there, I know what's special and not so special about workstation cards and gaming cards, I understand the specs and performance and even the circuitry. What I'm clueless about is brand, series, price, so-on. So after looking at budget AMD cards, Nvidia gaming and Quadro cards, Intel Xeon and AMD Vishera CPUs, Intel gaming processors, desktop and server motherboards, server cases, desktop cases, tech stations, I'm a bit overwhelmed. So here's where I stop rambling and start asking questions: Budget: I've decided on an approximate $5,000 base to $10,000 ceiling budget with a little leeway depending on the parts. I'm a little open for this. 
I plan on building it over time, I don't have a stack of money sitting here labeled "PC Parts", and since I don't, you can probably guess what I mean when I say reasonable. Part of the reason I can't tell you what exactly I mean by "reasonable" is because I don't actually have a good grasp of what a lot of the more professional components are actually worth, but if you can help me then that necessitates that you DO have a grasp, so just use your best judgement. Case: I've been heavily considering the DimasTech Easy V3.0 tech station for this, but I'm not sure. I know I want a large open air bench because I prefer horizontal motherboards, I plan on fiddling with it a lot, I'm going to want it open for maintenance, I want to have a lot of room for everything, I'm going to implement water cooling for that extra security/stability/longevity (especially the cards), and I'm going to want a very versatile chassis for anything I want to do with it for any particular project. So, test bench. Problem is, I don't know which one. The one I mentioned above looks great, but I'm not sure if there's anything better. It has to be sturdy, open, and versatile. Must be metal, not plastic or wood. Last thing is, while this machine is a tool more than anything, I still can't have it be ugly. Something that looks nice, like the DimasTech bench. If that's the one you'd recommend then yay me; if not, then which one and why? Motherboard: This one is really up in the air. I want it to be big and have plenty of slots and ports and features, but I can't get away from looking at them from the perspective of a gaming rig. This also kind of ties in with the processor as well, as I can't decide between desktop, workstation, and server. I can see the benefit of multiple CPUs for a lot of possible uses, but then I can see why it would be a hassle for others, and the boards don't seem very diverse, appealing, or even very open.
Workstations are a bit in the middle I know, so if you have a recommendation tell me, and as for desktops... Single CPU, more limited RAM, and a far more consumer-oriented selection, not so much tools as they are toys, except for a select few. Knowing my purpose and that my budget has a limit, what would you recommend? Power Supply: Again, I build gaming and enthusiast rigs. I immediately think, since I like Corsair, "AX1500i". Then again, I know there are far less attractive looking, yet far more powerful, stable, reliable, and versatile PSUs out there for more business-oriented applications. I just don't know them. Any ideas? CPU: I'm having trouble balancing AMD/Intel and desktop/workstation. Intel chips are inherently more powerful and they've got the Xeon line of workstation cores, which I'm not sure if I should go for or not over the high end desktop CPUs that will perform better to a degree but don't have a lot of the bells and whistles that the Xeons have, and vice versa. AMD chips I know are cheaper and have little quirks, like their funky 8-core processors that are actually more like 8/2+4/2 core processors, but they have near magical overclocking capabilities that can make them very useful for certain things. I've got an idea here, but I'm still having trouble reading the balance. There are three trays- AMD, Intel desktop, and Intel Xeon, and I can't figure out which one's heavier. RAM: This one is going to be easier. Things I know will apply heavily to this rig: Reliability trumps, more=better, speed counts too. I just don't know what would be best though. I know I don't need gaming RAM, it's a little different, but I do need something that I can't quite define. Whatever you think is best, I'm sure I'll think so as well. GPU: Among the most important in this rig, as I'll be running big calculations, and generally a lot of them. I've got a good hold on the gaming card hierarchy, but workstation cards confuse me. 
I know what's different, I just don't know which one to get, I don't know if the price is worth it, and I don't know a lot about the different models and series. Most of this is that I just don't have experience with the cards, so I just don't know where to start. I know that the big points about workstation cards are reliability and precision, but I can't decide whether to get workstation cards for those points or to get cheaper gaming cards with vastly similar hardware and tinker around with them a bit to get a lot of the same features working on them for less money. I know I'm going to have a multi-GPU system, I'll probably want Nvidia since their cards are just stomping on AMD's in terms of performance right now, and I'm going to put water blocks on them. Drives: I'm probably going to go for WD enterprise drives. They seem to be a standard. I'll most likely have a couple of SSDs, which would most likely be Intel, but if you have any advice~ The SSDs are in there mostly because I like keeping system separate from storage and because, while it isn't necessarily going to be a desktop machine where I'll need an operating system to be snappy or whatnot, SSDs can speed up processes a lot of the time, so... Yeah. Basically, if you think I should do anything different, I'd appreciate the info. Cooling: I have good experience with Corsair's closed loops; depending on radiator mounting I'll choose from their selection. I'm going to do a custom loop for the GPUs, so recommendations for parts and general tips would be helpful. So basically... I have a very vague laundry list and I need a bit of guidance, since I've never built anything like this before. I feel silly saying "I'm building a machine for computer science research", and asking people the most basic questions about what parts I should use. I'm hoping for this to be very powerful and capable of consistent operation, very open and versatile, very serviceable, very expandable, and I'm hoping I can make such a machine look nice.
Oh, and not make my wallet cheat on me. This is going to be an accumulating build, and it's going to be a first for me. Any input is helpful. Unless it's not really related to hardware recommendations, tips, or whatnot, then it really isn't. I'll read over this later and most likely edit the bajeezus out of it, because it's 3 in the morning and my mind is betraying me. Any ambiguity, paragraph-long typo, or stupid stupid stupid sentence will most likely be ironed out. That is, if I can edit posts after I post them on here... ... Of course I can, it's a forum. I'm tired. Thank you so much if you managed to read through this mess, and thank you even more if you have good information for me. I'm tired.
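Since the stated workload is brute-forcing optimal solutions over permutations, here's a tiny Python sketch of that kind of job (the objective function is made up, purely to show the shape of the computation and why it scales so badly):

```python
import itertools

# Brute force over permutations grows as n!: 6 elements is 720 orderings,
# 15 elements is over a trillion -- hence the need for serious hardware.
def best_ordering(items, cost):
    """Return the permutation of items minimising a user-supplied cost."""
    return min(itertools.permutations(items), key=cost)

# Example objective: order numbers so adjacent differences are small.
def adjacency_cost(perm):
    return sum(abs(a - b) for a, b in zip(perm, perm[1:]))

result = best_ordering([3, 1, 4, 1, 5, 9], adjacency_cost)
```

Workloads like this are embarrassingly parallel (the permutation space splits cleanly across cores), which is worth keeping in mind when weighing core count against per-core speed.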
- 8 replies
-
- research
- workstation
-
(and 4 more)
Tagged with:
-
I'm researching the topic of pirating, torrenting, and how they affect the entertainment industry. Pirating has become much, much easier in recent years, and Congress has done many things to try and stop it; SOPA is one. They explain how pirating music and movies costs the U.S. economy over $200 billion a year, and over 750,000 jobs. Clearly it's getting out of hand and needs to be stopped. It's safe to assume that during the time that The Pirate Bay was offline, pirating decreased. Temporarily. These numbers, especially the jobs, are frightening. How accurate are they, though? Mark Twain said, "There are three kinds of lies: lies, damned lies, and statistics." These numbers aren't tangible. They are estimates, yes, based on potential revenue. That's an entire topic that could be talked about for 10 pages. The number of jobs, let's see, maybe those are potential jobs too. What if it includes the number of jobs lost obliquely, or even for completely unrelated reasons? These kinds of things aren't talked about enough. For instance, the potential revenue lost from pirating will most likely be spent elsewhere, and would never have been spent on what was pirated in the first place. If I go to the store, see a movie that I want, then go home and pirate it, that's lost potential. If I wake up, look at the top 100 movie torrents, and just start downloading whatever sounds cool, there was no potential, because I wouldn't pay to see those movies anyway. Tell me what you guys think and take a vote; these stats will be used in my presentation. Anonymously, of course. All of this seems to only apply to big-budget movies, Hot 100 songs, etc. What about smaller movies? Well, there's another debate. For instance, if you have an independent film, and it's pay-to-rent online, how many people are going to watch it? Case studies have shown that allowing the free distribution of things like movies and songs greatly increases their popularity and is, in a sense, a marketing tactic.
- 77 replies
-
- torrenting
- pirating
-
(and 2 more)
Tagged with:
-
Hey guys! I'm a fellow high school student who had to pick a topic that I hold interest in and develop a small research paper on it. I just wanted to ask you guys the question that I formulated and read your opinions and/or ideas. My question (which you guys should be familiar with): To what extent does Moore's Law predict the evolution of computing? To answer this question I will be looking into the history of computing and using some computer science to formulate my assertion. REMEMBER, I have to use history and computer science together to create the assertion. Therefore, I will look at the data on integrated circuits throughout time and elaborate upon subjects like quantum tunneling when the integrated circuits get too small. I also ask that your input be valid, and if you have sources, that would be great. Thank you guys. If you wish to help me privately, email me at jerkingman@gmail.com (it's an old email I used to keep myself anonymous... please don't judge me)
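For the quantitative side of the paper, Moore's observation reduces to a simple doubling formula. A minimal sketch (the Intel 4004 baseline is real; the clean two-year doubling is the idealisation you'd be testing against history):

```python
# Moore's Law as a formula: transistor counts double roughly every two years.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Extrapolate from the Intel 4004's ~2,300 transistors in 1971."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Naive extrapolation to 2015: 2300 * 2^22, i.e. roughly 9.6 billion,
# the same order of magnitude as real 2015-era high-end chips.
estimate_2015 = transistors(2015)
```

Plotting this curve against actual transistor-count data, and noting where physical effects like quantum tunneling bend it, would give you exactly the history-plus-computer-science assertion you're after.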
-
Edit: Changed the title to something more appropriate. Edit 2: Turns out this article is well over a year old; OC3D had reposted an old article. Link to another source provided by @RedSphyxis - HERE With the release date of Batman: Arkham Knight drawing closer, I have been looking at more and more Batman news, both real and fake. Overclock3D recently published an article from the folks over at MoneySuperMarket, who decided to price up how much they thought it would cost to be Batman, with all of the armor and tech gadgets (assuming they exist) from The Dark Knight film series. Note: This does not include the wealth of Wayne Enterprises. Admittedly, I don't know whe
-
So that basic witchcraft on the new MacBook that lets a solid piece of aluminum function like a proper trackpad, clicking without any actual movement, is (ideally) coming to Apple's own keyboards. Why? Who knows. A laptop keyboard that has no mechanism can help you shave even more off the thickness (Apple loves them thin things), but more seriously, it can probably solve a big problem with thinner keyboards: crappy response. I don't know about you guys, but a lot of laptop keyboards just don't have enough feedback. Having it all controlled through haptic feedback, and even letting you configure what response a key gives when you press it? Very cool. Hopefully this escapes the typical "we did it for the sake of patenting" fate and makes it into proper production. http://techcrunch.com/2015/03/19/apple-researching-taptic-feedback-for-keyboards-with-no-physical-keys/
-
Hello! I may be asking questions a lot in the forums (maybe a few every month) and would greatly thank anyone who shares knowledge and links to research and study. I've been looking into security and networking for the past few months, and I've become extremely interested in a journey of study while at ITEC college. Although I love researching as much as I can find and learning, I also like to ask questions (maybe stupid ones too!). I would like to ask if it's okay to ask as many questions as are on my mind in the forums; maybe it'd be easier to PM people instead of filling up the forums, or to ask questions in this topic. I've been looking at some qualifications to further my studies and was told by my college teacher to look at CCENT and then move on to CCNA. (Also CompTIA Network+; I'm currently looking into CompTIA A+ on Wednesdays with another teacher in the college. Although it doesn't grant a qualification, he said he is willing to teach people who are interested.) So, I do enjoy reading books; I personally find myself extremely distracted, so I like being away from my computer if I try to study for more than 3+ hours. I'd like your opinion on any books/PDFs I can look into and grab for my studies. Anyone interested in talking and sharing knowledge/answering questions would be very welcome to PM.
-
An assignment to research this topic to find what you can about it
-
Hey all, I'm currently researching new semiconductor materials for (eventual) use in computers and related technology (additional applications include solar cells, LEDs, optoelectronics, etc.). However, my knowledge of computers is limited to what an entry-level computer technician would know; I can build computers, work with peripherals, and explain how different parts of a computer work about half as well as Linus can, but that isn't sufficient for my work. What I want to know is, does anyone here bridge the gap in that knowledge base, going from fundamental semiconductor knowledge to computer semiconductor technology? I really feel like I could improve my research direction in materials science if I knew how my semiconductors applied to RAM/cache/bus speeds/processors/etc., because my eventual goal would be to take something like the current cost-prohibitive cache systems and make them as cheap as standard RAM (not entirely possible, but it can be much closer than it is). However, it'd be nice to hear someone say that it's actually worth the time. So do you think it is? If I could make a material that made cache 5-10 times cheaper, would we dump that much more memory on processors? Or is there another reason, in addition to cost, that it's prohibitive to just drop memory on the processor directly? I'm just as happy to hear that it's not going to make a significant improvement in tech, because that means I can move on and spend less time on the topic. But in general, I'd like to start some materials science talks here on what materials go into each computer component and why they're selected! I think that would be an awesome topic here. Cheers! A physics grad student
-
Before I start posting my theory, I want to say that my "theory" is purely speculation, and it's only somewhat viable if everything stays the same (no major wars, no technical difficulties) and everything moves smoothly. Thinking classically, for a real-time brain simulation in your pocket, it would take until ~2080. But I think realistically it would take x years, until we reach a manufacturing process that is smaller than today's by three whole orders of magnitude: for example, not 22 nm but 22 pm (picometres). What do you guys think: how many years do you think we need for such a computer to be in our pocket?
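A back-of-envelope version of that scaling argument (the 0.7x shrink per generation and the 2-year cadence are assumptions, and physics breaks the trend long before picometre scale, since a silicon atom is about 0.2 nm across):

```python
import math

# Assume feature size shrinks ~0.7x per process generation, with one
# generation every 2 years (both hypothetical, Moore's-Law-style rates).
shrink_per_gen = 0.7
years_per_gen = 2.0

# 22 nm down to 22 pm is a 1000x linear shrink.
generations = math.log(22e-9 / 22e-12) / math.log(1.0 / shrink_per_gen)
years_needed = generations * years_per_gen  # roughly 19 generations, ~40 years
```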
-
Folding for Android: for the first time, anyone with access to an Android-based mobile device can help scientists perform the critical research needed to find a cure for the Ebola epidemic. The project is courtesy of IBM and the Scripps Research Institute and is called "Outsmart Ebola Together". The software used for screenings in the Outsmart Ebola Together project is called AutoDock and AutoDock Vina, developed by the Olson laboratory at TSRI. IBM's World Community Grid has successfully run other projects that search for drug candidates for both high- and low-profile diseases such as AIDS, cancer, malaria, dengue fever, and influenza. It has enabled multiple breakthroughs, such as helping the Chiba Cancer Center in Japan discover seven new drug candidates to fight childhood neuroblastoma. The idea behind the project is not new to us; the basic overall procedure is the same as Folding@home, except this is the first time it has been brought to a low-power processor such as a mobile device. This citizen science effort is possible through a partnership with IBM's World Community Grid, which has been making similar data-driven health and sustainability initiatives possible for 10 years as a free, philanthropic service to the science community. As of now, anyone can download a safe and free app that will put their devices to work when the machines would otherwise be idle. With their collective processing power, the computers will form a virtual supercomputer to help The Scripps Research Institute (TSRI) screen millions of chemical compounds to identify new drug leads for treating Ebola. Meanwhile, the devices will remain fully available for normal use by their owners. TSRI also invites members of the public to support Dr. Saphire's crowdfunding campaign at www.crowdrise.com/CUREEBOLA to secure resources needed to analyze the enormous volume of data generated by Outsmart Ebola Together.
Crowdsourcing this citizen science effort will dramatically accelerate the process of identifying a cure. If you register your tablet, smartphone or computer, the system will add you to the list of 680,000 volunteers and nearly 3 million devices. Sensational news! Do post your thoughts & comments down below. To get started, register at http://www.worldcommunitygrid.org/, give proper details, then download the app (for mobile/tablet & PC) and carry on with the application install; the rest will be done by the app in the background. Original link: http://www-03.ibm.com/press/us/en/pressrelease/45594.wss Other links: www.crowdrise.com/CUREEBOLA http://www.scripps.edu/ http://www.citizenibm.com/ http://www.worldcommunitygrid.org/ News link: http://www.engadget.com/2014/12/19/ebola-scripps-ibm-world-community-grid/
-
NASA has already invested $125,000 in a 3D printer that would be used by astronauts in zero gravity to make pizza, and is apparently expecting a prototype in six months. But Anjan Contractor, the brilliant mechanical engineer with a background in 3D printing behind this, envisions a much more mundane and ultimately more important use for the technology. He sees a day when every kitchen has a 3D printer, and the earth's 12 billion people feed themselves customized, nutritionally appropriate meals synthesized one layer at a time, from cartridges of powder and oils they buy at the corner grocery store. Contractor's vision would mean the end of food waste, because the powder his system uses is shelf-stable for up to 30 years, so each cartridge, whether it contains sugars, complex carbohydrates, protein or some other basic building block, would be fully exhausted before being returned to the store. Contractor says ubiquitous food synthesizers would also create new ways of producing the basic calories on which we all rely. Since a powder is a powder, the inputs could be anything that contains the right organic molecules. We already know that eating meat is environmentally unsustainable, so why not get all our protein from insects? If eating something spat out by the same kind of 3D printers that are currently being used to make everything from jet engine parts to fine art doesn't sound too appetizing, that's only because you can currently afford the good stuff, says Contractor. That might not be the case once the world's population reaches its peak size, probably sometime near the end of this century. Chocolate printing trial:
-
Hello forum! I see a lot of threads about which case is best or the best bang for your buck, but I have a different question in mind, and maybe it will help others out in some way. What do you guys look for in a case? What do you research when building a new PC or upgrading from a previous one? Do you normally stick with one manufacturer or go for whatever is good? The reason I ask is that so many good cases are out: the H440, 760T, 250D, etc. Love to hear your opinions!
-
Official LTT BOINC FAQ & Guide NOTE: Still a work in progress for now. As the old BOINC FAQ was a bit of a mess, we have decided to replace it with a new one. Massive thanks to @tobben, as well as @Liquidus, @Brainiac777 and @Patramix for their contributions. Also thanks to @Me1z for notifying me of some broken links. Unlike the old thread, this one will not serve for discussion; if you have a question or issue, just create a new thread in this forum section, that's what it's for after all. Also, the target audience for this, at least for the time being, will mostly be beginners. What is BOINC? BOINC is a collection of various distributed computing projects; it gives you the opportunity to lend your unused processing power to do calculations in the name of science and progress! This can be to help medical science, physics, mathematics, to better understand nature, or pretty much anything you can find a corresponding project for. It's up to you to decide what you want your computing horsepower to be used for. This gives scientists the chance to gather big amounts of data by giving them access to hundreds or thousands of computers. All of this is achieved by running some software on your computer; there is also software available for smartphones. Getting Started Well, first you need the program, don't you? The BOINC binaries can be downloaded from BOINC's website here. Alternatively, if you're running GNU/Linux, you probably already have BOINC available in your repositories and can install it from there. If your distro has a wiki/help page on it, I recommend reading that, since there can be some slight differences between distros with regards to setting everything up. Account Managers Vs. Local Configuration BOINC has something called account managers, for instance BAM! Their primary purpose is to consolidate administration over multiple projects and computers into one central location.
It allows you to join projects from one single place, attach and detach computers to specific projects, and set all kinds of different configurations from one location without needing to go to every computer and configure it separately. You can read more about it on their website (see link above). For now, we'll be sticking to the other variant, which is configuring stuff locally, as it is a bit simpler for beginners. Joining Projects and a Team For each project in which you take part, you will have an account on the project website itself as well. Alternatively, if you're not running multiple machines or are a bit confused by account managers, you can also administer BOINC for each PC locally, which is what we'll cover here for the sake of beginner friendliness. Open the BOINC Manager; it should show the basic viewer by default. Click "Add a project" (screenshots are from tobben, hence not in English, but they should still be understandable). If you wish to use an account manager, you can click "Use account manager" here, then enter the URL of your account manager of choice; your PC should then sync with the account manager. Alternatively, to not use an account manager, click "Add project". You will then be presented with a list of all available projects. This is probably where you'll spend quite a bit of time, figuring out what kind of research you wish to support. You can join multiple projects and have them active on your machine at the same time; BOINC will just distribute computing power and time among them. Pick the project you want, click "Next" and proceed to make an account. This will create an account for you on the home page of the project you chose. You will have to make an account for each of the different projects you want to join; if you already have an account on the corresponding project, click "Existing user" and log in. After you have done this, it should take you to the web browser and give you some options to join a team and configure your settings.
Joining teams can be a bit tricky. You can join a different team for each project, so you need to join the team of your choice on each project separately. For some teams this can be done from within the BOINC Manager, but sometimes that doesn't work correctly, and in those cases you need to go to the project's website, log in to your account and set your team membership there.

To join a team in the BOINC Manager, click "View" and set it to "Advanced view". In the advanced view, click the "Projects" tab, where you will find a list of your projects. Click the project on which you want to join the LinusTechTips_Team, and on the left side under "Project web pages", click "Your account". If that doesn't work, go via your account on the project website: go to the community section on the right, find the field that says "Find a team" and click on it. Search for the LinusTechTips team, click the result that is our team (LinusTechTips_Team from Canada), then click "Join this team".

To confirm you're on the team, go back into the BOINC Manager, open the Projects tab in the advanced view and check whether it says "LinusTechTips_Team" under the Team header. If it doesn't, update the project. If it still doesn't work after that, something has gone wrong and you need to start troubleshooting.

Some More BOINC Manager Info

Most configuration in the BOINC Manager, as well as the more detailed information, is accessed through the advanced view. For some configuration in the basic view, go to "Tools" -> "Settings". From the advanced view you can see your point statistics, your uploads/downloads, the current projects you are working on, etc. In the Projects tab you can, among other things, stop and start receiving work units for each individual project you are working on. Something particularly useful in case of trouble is the event log, located at the bottom of the "Advanced" menu.

Why isn't BOINC doing anything?
Often that is because BOINC is set by default to only work when your computer is idle. You can change this in the "Activity" tab in the advanced view; select the options you wish. Alternatively, you can go to "Tools" -> "Computing preferences" in the advanced view, which gives you much more fine-grained control over when BOINC should do what.

Using GPU and CPU for Different Projects / Editing Project Preferences

To edit the project preferences, you need to go to your account on the specific project you wish to configure. In the "Preferences" section, select the preferences for that project and click "Edit preferences". From there you can set whether you want to use the GPU, the CPU or both for the project, how many resources you want to dedicate to it, and so on.

Badges

In order to be rewarded with the BOINC team badge, please note the following requirements (subject to change):

Contributor - When you have contributed more than 25,000 credits, you will qualify for this badge, to show your commitment to LTT and the team.
Bronze Contributor - When you have contributed more than 1 million credits, you will qualify for this badge. This reflects your serious and continued support of the team.
Silver Contributor - When you have contributed more than 25 million credits, you will qualify for this badge. This reflects your serious and continued support of the team and your very generous donation of resources.
Gold Contributor - When you have contributed more than 100 million credits, you will qualify for this badge. This reflects your dedication and continued support of the team and your very generous donation of resources. You are now a legend!

That's it. Once you hit one of the milestones, PM your BOINC ID to one of the moderators, preferably Whiskers, who will update it in due time.
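The badge tiers above can be expressed as a small lookup, which also makes the "more than" cutoffs unambiguous. A sketch for illustration only; the tier names and credit thresholds come from this guide and are subject to change:

```python
# Badge tiers from the guide, highest threshold first.
# "More than" means strictly greater than the listed credit count.
BADGE_TIERS = [
    (100_000_000, "Gold Contributor"),
    (25_000_000, "Silver Contributor"),
    (1_000_000, "Bronze Contributor"),
    (25_000, "Contributor"),
]

def badge_for(credits):
    """Return the highest badge earned for a credit total, or None."""
    for threshold, name in BADGE_TIERS:
        if credits > threshold:
            return name
    return None

print(badge_for(30_000))     # Contributor
print(badge_for(2_000_000))  # Bronze Contributor
```

Note that exactly 25,000 credits does not yet earn the Contributor badge under a strict reading of "more than".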
Links

BOINC home page: http://boinc.berkeley.edu/
Stats link: http://boincstats.com/en/stats/-1/team/detail/b85bbff1c6df413a4b44cfb82854273f/
GPU project lists: https://boinc.berkeley.edu/wiki/GPU_computing
DC-Vault link (a site that combines numerous project teams into one global score): http://www.dc-vault.com/showteam.php?team=547

The Old FAQ
-
- boinc
- distributed computing
-
(and 3 more)
Tagged with:
-
Image via: http://map.ipviking.com/

It should be no surprise to anyone that a massive number of network scans and penetration attempts originate from China. While I was taking a network and security course at Virginia Tech, our professor visualized all the attacks on our campus network overlaid onto Google Earth. The vast majority of the attacks came from roughly three places in China; the map lines originating from China were about 100 times thicker than those from anywhere else, indicating a volume 100 times higher from those IPs/IP blocks.

Recently, Canadian officials claimed that hackers working at the behest of the Chinese government gained access to Canada's NRC network, which is not a surprising piece of news considering the amount of cyber threats originating from China. The NRC, or National Research Council, is Canada's chief research and technology organization. While there is no evidence that any other networks in the Canadian government's infrastructure were infiltrated, the NRC network has now been isolated from the greater government system as a precaution. Tangential and peripheral (which often means less secured) networks are frequently used as an attack vector, allowing hackers to gain access to the larger, juicier target they actually want.

Image via www.asianews.it

“Recently, the Government of Canada, through the work of the Communications Security Establishment, detected and confirmed a cyber intrusion on the IT infrastructure of the National Research Council of Canada by a highly sophisticated Chinese state-sponsored actor.” -- Corinne Charette, Canada’s Chief Information Officer

The implication that this was a state-sponsored attack, as opposed to one by a hacker group or a private corporation, is what makes this news story noteworthy. The Chinese government and industrial complex are known for having loose to blatantly nonexistent regard for copyrights and patents.
Clones of commercial products (from consumer electronics to knock-off Bentleys) and even of military hardware like drones are run-of-the-mill in China, so it makes sense that China would want to gain access to a nationally backed research organization. Hopefully they didn't gain access to Canada's super-secret bagged-milk super-soldier project!

Source: https://community.we...57?sf29171718=1
-
Hi there! So I've got a big procrastination problem. Any way to motivate myself to work on stuff and stop getting distracted? Thanks in advance...
- 23 replies
-
- i should be writing a paper
- procrastination
-
(and 1 more)
Tagged with: