This method reads a large amount of data from a text file with millions of lines, and it takes between 6 and 7 minutes to read and filter it. Is there any way to make it faster?
private static List<WebShopDataAccess> GetWebShopDataAccesses(string path)
{
    List<WebShopDataAccess> elements = new List<WebShopDataAccess>();

    // Read the entire file into memory, then keep only the lines that contain "google".
    List<string> lines = File.ReadAllLines(path).ToList();
    var filterIPs = lines.Where(x => x.Contains("google")).ToList();

    Parallel.ForEach(filterIPs, line =>
    {
        // Each filtered line is parsed into a WebShopDataAccess and added to `elements`
        // (the parsing code is omitted here).
    });

    return elements;
}
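For comparison, one common way to reduce both memory pressure and runtime in a case like this is to stream the file with File.ReadLines instead of materializing every line up front with File.ReadAllLines, and to filter and parse in a single pass. The sketch below is only one possible approach; it assumes a hypothetical WebShopDataAccess.Parse(string) helper, since the actual parsing logic is not shown in the question.

using System.Collections.Generic;
using System.IO;
using System.Linq;

internal static class WebShopLogReader
{
    private static List<WebShopDataAccess> GetWebShopDataAccesses(string path)
    {
        // File.ReadLines enumerates the file lazily, so the whole file is never
        // held in memory at once; filtering and parsing happen in one pass.
        return File.ReadLines(path)
                   .Where(line => line.Contains("google"))                // same filter as the original
                   .Select(line => WebShopDataAccess.Parse(line))         // hypothetical parser, not in the question
                   .ToList();
    }
}

If the per-line parsing itself turns out to be the expensive part rather than the I/O, inserting .AsParallel() after the Where call is a possible way to spread that work across cores, at the cost of losing the original line order unless .AsOrdered() is also used.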