The general consensus is that an i5-6600 can drive any graphics card, but I've seen some information to the contrary, especially in newer titles (The Witcher 3 and Star Wars Battlefront) where the i5 tops out at 100% usage and leaves the graphics card dipping below 100%. Any thoughts? Is an i7 really required to let the graphics card open up? i7s have worse price/performance.
Generally, a half-decent CPU is plenty for a video game: the vast majority aren't programmed to use more than four cores (at most), and clock speed alone isn't indicative of total system performance. Skylake is the newest CPU core out there, so if a game runs like crap on it, that's down to the game's programming, not the CPU. In any case, games with more demanding graphics have user-adjustable settings, including resolution and detail level. Now, the integrated graphics in Skylake may not be up to snuff in AAA titles, but it was never really intended to replace a discrete card.
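To see why extra cores past four often buy so little, here's a rough Amdahl's-law sketch. The 0.6 parallel fraction is purely an illustrative assumption (actual games vary a lot), not a measured number for any title:

```python
# Rough Amdahl's-law sketch: if only a fraction p of per-frame work
# parallelizes, additional cores give diminishing returns.
# p = 0.6 is an assumed, illustrative value, not a measurement.

def speedup(p, cores):
    """Amdahl's law: speedup over a single core when fraction p is parallel."""
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (1, 2, 4, 6, 8):
    print(f"{cores} cores: {speedup(0.6, cores):.2f}x")
# With p = 0.6, going from 4 to 8 cores adds well under 20% more speed,
# which is why a quad-core i5 is usually "enough" and a faster clock
# (or better game code) matters more than more cores.
```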