Post History
Answer
#1: Initial revision
This is solved in 2 steps:

* Find rows matching the remove conditions
* Do an anti-left-join on the composite key `(Die, Cell)`

To filter out the rows:

```
import pandas as pd

# Read in the data
df = pd.read_csv("data.csv")

# Identify cells with low resistance at Current == 100
b100 = df[(df["Resistance"] < 100000) & (df["Current"] == 100)]

# Pull out only the key columns of interest
bad_cells = b100[["Die", "Cell"]]
```

We can take unique values of `bad_cells` here, but I didn't, because it doesn't affect the outcome in the end. Also, taking the column subset is not necessary, but it reduces extra junk columns later.

Pandas has no explicit anti-join ("join on *doesn't equal*"), so I stole it from https://stackoverflow.com/a/55543744/21703684:

```
# Remove the bad cells with an anti-join: left-merge on (Die, Cell),
# then keep only the rows that found no match in bad_cells
outer_join = df.merge(bad_cells, how="left", indicator=True)
filtered = outer_join[outer_join["_merge"] != "both"].drop("_merge", axis=1)
print(filtered)
```
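For reference, a self-contained toy run of the same two steps, with the optional `.drop_duplicates()` mentioned above included. The values in the frame are made up purely to show the mechanics; your real data comes from `data.csv`:

```
import pandas as pd

# Made-up stand-in for data.csv
df = pd.DataFrame({
    "Die":        [1, 1, 2, 2],
    "Cell":       [1, 2, 1, 2],
    "Resistance": [50000, 200000, 150000, 80000],
    "Current":    [100, 100, 100, 200],
})

# Rows matching the remove conditions, reduced to the composite key
bad_cells = (
    df[(df["Resistance"] < 100000) & (df["Current"] == 100)][["Die", "Cell"]]
    .drop_duplicates()
)

# Anti-left-join: keep only rows whose (Die, Cell) never matched bad_cells
outer_join = df.merge(bad_cells, how="left", indicator=True)
filtered = outer_join[outer_join["_merge"] != "both"].drop("_merge", axis=1)
print(filtered)
# Keeps (Die=1, Cell=2), (Die=2, Cell=1), (Die=2, Cell=2)
```

Because the merge is a left join, the indicator column can only be `"both"` or `"left_only"`, so filtering on `!= "both"` is equivalent to keeping `"left_only"` rows.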